submission_id: zonemercy-graft-cogent-v_7573_v6
developer_uid: zonemercy
alignment_samples: 30214
alignment_score: 4.302452509421002
best_of: 16
celo_rating: 1301.46
display_name: 0815v1-7
formatter: {'memory_template': "{bot_name}'s Persona: {memory}\n####\n", 'prompt_template': '{prompt}\n<START>\n', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 0.9, 'top_p': 1.0, 'min_p': 0.05, 'top_k': 80, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n', '</s>', '###', 'Bot:', 'User:', 'You:', '<|im_end|>'], 'max_input_tokens': 512, 'best_of': 16, 'max_output_tokens': 64}
gpu_counts: {'NVIDIA RTX A5000': 1}
is_internal_developer: True
language_model: zonemercy/Graft-Cogent-v1-Acute-Nemo-v0-5e6ep1
latencies: [{'batch_size': 1, 'throughput': 0.6421741936343291, 'latency_mean': 1.557147386074066, 'latency_p50': 1.5470473766326904, 'latency_p90': 1.744460153579712}, {'batch_size': 2, 'throughput': 0.8561915135385708, 'latency_mean': 2.3274331212043764, 'latency_p50': 2.340199828147888, 'latency_p90': 2.558147358894348}, {'batch_size': 3, 'throughput': 0.9447686154215496, 'latency_mean': 3.166335371732712, 'latency_p50': 3.1889748573303223, 'latency_p90': 3.5797492027282716}, {'batch_size': 4, 'throughput': 0.9248492969011531, 'latency_mean': 4.312939705848694, 'latency_p50': 4.340110540390015, 'latency_p90': 4.762362933158874}, {'batch_size': 5, 'throughput': 0.9022565762216171, 'latency_mean': 5.5194861495494845, 'latency_p50': 5.562889575958252, 'latency_p90': 6.28474178314209}]
max_input_tokens: 512
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: zonemercy/Graft-Cogent-v
model_name: 0815v1-7
model_num_parameters: 12772070400.0
model_repo: zonemercy/Graft-Cogent-v1-Acute-Nemo-v0-5e6ep1
model_size: 13B
num_battles: 30214
num_wins: 17392
propriety_score: 0.7094811018577835
propriety_total_count: 3122.0
ranking_group: single
status: torndown
submission_type: basic
throughput_3p7s: 0.94
timestamp: 2024-08-15T18:20:08+00:00
us_pacific_date: 2024-08-15
win_ratio: 0.5756271926921295
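The formatter dict in the metadata above defines per-role templates. As a sketch of how such templates might compose a full prompt (the field names are taken from the logged dict; the assembly order — memory, then prompt, then chat turns, then the response prefix — is an assumption, not something the log confirms):

```python
# Sketch of prompt assembly from the logged formatter templates.
# Field names come from the metadata above; the concatenation order
# is an assumption for illustration.
formatter = {
    "memory_template": "{bot_name}'s Persona: {memory}\n####\n",
    "prompt_template": "{prompt}\n<START>\n",
    "bot_template": "{bot_name}: {message}\n",
    "user_template": "{user_name}: {message}\n",
    "response_template": "{bot_name}:",
}

def build_prompt(bot_name, memory, prompt, turns, user_name="User"):
    """turns: list of (speaker, message) pairs, speaker in {'bot', 'user'}."""
    out = formatter["memory_template"].format(bot_name=bot_name, memory=memory)
    out += formatter["prompt_template"].format(prompt=prompt)
    for speaker, message in turns:
        if speaker == "bot":
            out += formatter["bot_template"].format(bot_name=bot_name, message=message)
        else:
            out += formatter["user_template"].format(user_name=user_name, message=message)
    # Response template ends without a newline so the model continues the line.
    out += formatter["response_template"].format(bot_name=bot_name)
    return out

example = build_prompt(
    bot_name="Bot",
    memory="a helpful assistant",
    prompt="A chat.",
    turns=[("user", "hi"), ("bot", "hello")],
)
print(example)
```

Note the stopping words in generation_params (`'\n'`, `'Bot:'`, `'User:'`, …) line up with this layout: generation halts at the first newline, keeping each sampled reply to a single chat line.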
Running pipeline stage MKMLizer
Starting job with name zonemercy-graft-cogent-v-7573-v6-mkmlizer
Waiting for job on zonemercy-graft-cogent-v-7573-v6-mkmlizer to finish
Stopping job with name zonemercy-graft-cogent-v-7573-v6-mkmlizer
%s, retrying in %s seconds...
Starting job with name zonemercy-graft-cogent-v-7573-v6-mkmlizer
Waiting for job on zonemercy-graft-cogent-v-7573-v6-mkmlizer to finish
Failed to get response for submission function_fahor_2024-08-15: no entry with id "function_fahor_2024-08-15" found on database!
zonemercy-graft-cogent-v-7573-v6-mkmlizer: quantized model in 41.375s
zonemercy-graft-cogent-v-7573-v6-mkmlizer: Processed model zonemercy/Graft-Cogent-v1-Acute-Nemo-v0-5e6ep1 in 106.222s
zonemercy-graft-cogent-v-7573-v6-mkmlizer: creating bucket guanaco-mkml-models
zonemercy-graft-cogent-v-7573-v6-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
zonemercy-graft-cogent-v-7573-v6-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/zonemercy-graft-cogent-v-7573-v6
zonemercy-graft-cogent-v-7573-v6-mkmlizer: cp /dev/shm/model_cache/config.json s3://guanaco-mkml-models/zonemercy-graft-cogent-v-7573-v6/config.json
zonemercy-graft-cogent-v-7573-v6-mkmlizer: cp /dev/shm/model_cache/special_tokens_map.json s3://guanaco-mkml-models/zonemercy-graft-cogent-v-7573-v6/special_tokens_map.json
zonemercy-graft-cogent-v-7573-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/zonemercy-graft-cogent-v-7573-v6/tokenizer_config.json
zonemercy-graft-cogent-v-7573-v6-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/zonemercy-graft-cogent-v-7573-v6/tokenizer.json
zonemercy-graft-cogent-v-7573-v6-mkmlizer: cp /dev/shm/model_cache/flywheel_model.0.safetensors s3://guanaco-mkml-models/zonemercy-graft-cogent-v-7573-v6/flywheel_model.0.safetensors
Job zonemercy-graft-cogent-v-7573-v6-mkmlizer completed after 135.8s with status: succeeded
Stopping job with name zonemercy-graft-cogent-v-7573-v6-mkmlizer
Pipeline stage MKMLizer completed in 137.16s
Running pipeline stage MKMLKubeTemplater
Pipeline stage MKMLKubeTemplater completed in 0.09s
Running pipeline stage ISVCDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6
Waiting for inference service zonemercy-graft-cogent-v-7573-v6 to be ready
Failed to get response for submission function_fahor_2024-08-15: no entry with id "function_fahor_2024-08-15" found on database!
Inference service zonemercy-graft-cogent-v-7573-v6 ready after 231.73025107383728s
Pipeline stage ISVCDeployer completed in 233.13s
Running pipeline stage StressChecker
Received healthy response to inference request in 2.373199701309204s
Received healthy response to inference request in 2.1309714317321777s
Received healthy response to inference request in 2.117873430252075s
Received healthy response to inference request in 3.50740122795105s
Received healthy response to inference request in 3.640125274658203s
5 requests
0 failed requests
5th percentile: 2.120493030548096
10th percentile: 2.1231126308441164
20th percentile: 2.128351831436157
30th percentile: 2.179417085647583
40th percentile: 2.2763083934783936
50th percentile: 2.373199701309204
60th percentile: 2.8268803119659425
70th percentile: 3.2805609226226804
80th percentile: 3.5339460372924805
90th percentile: 3.587035655975342
95th percentile: 3.6135804653167725
99th percentile: 3.634816312789917
mean time: 2.753914213180542
Pipeline stage StressChecker completed in 14.42s
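Several derived figures in this log can be reproduced from the raw values it contains. A quick pure-Python check (the linear-interpolation percentile rule and the reading of throughput_3p7s are assumptions, but both match the logged numbers):

```python
import math

# Raw per-request latencies from the StressChecker lines above.
samples = [
    2.373199701309204,
    2.1309714317321777,
    2.117873430252075,
    3.50740122795105,
    3.640125274658203,
]

def percentile(xs, p):
    """Linear-interpolation percentile (assumed; matches the logged values)."""
    xs = sorted(xs)
    k = (len(xs) - 1) * p / 100.0
    f, c = math.floor(k), math.ceil(k)
    if f == c:
        return xs[f]
    return xs[f] + (k - f) * (xs[c] - xs[f])

mean_time = sum(samples) / len(samples)  # ~2.7539, as logged
p50 = percentile(samples, 50)            # ~2.3732, as logged
p90 = percentile(samples, 90)            # ~3.5870, as logged

# The headline win_ratio is simply num_wins / num_battles.
win_ratio = 17392 / 30214                # ~0.575627, as logged

# throughput_3p7s plausibly reads as "best throughput among batch sizes
# whose p90 latency stays under 3.7s" -- an assumption, but it matches.
pairs = [  # (throughput, latency_p90) from the latencies metadata above
    (0.6421741936343291, 1.744460153579712),
    (0.8561915135385708, 2.558147358894348),
    (0.9447686154215496, 3.5797492027282716),
    (0.9248492969011531, 4.762362933158874),
    (0.9022565762216171, 6.28474178314209),
]
throughput_3p7s = max(t for t, lat in pairs if lat <= 3.7)  # rounds to 0.94
```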
zonemercy-graft-cogent-v_7573_v6 status is now deployed due to DeploymentManager action
zonemercy-graft-cogent-v_7573_v6 status is now inactive due to admin request
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.29s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
%s, retrying in %s seconds...
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLProfilerTemplater completed in 0.32s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Ignoring service zonemercy-graft-cogent-v-7573-v6-profiler already deployed
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage %s skipped, reason=%s
Pipeline stage MKMLProfilerTemplater completed in 0.19s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.13s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Inference service zonemercy-graft-cogent-v-7573-v6-profiler ready after 140.331396818161s
Pipeline stage MKMLProfilerDeployer completed in 140.80s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplol596d:/code/chaiverse_profiler_1725330374 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplol596d --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725330374 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725330374/summary.json'
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Skipping teardown as no inference service was found
Pipeline stage MKMLProfilerDeleter completed in 1.43s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.12s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Inference service zonemercy-graft-cogent-v-7573-v6-profiler ready after 150.36288213729858s
Pipeline stage MKMLProfilerDeployer completed in 150.95s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo5kpqj:/code/chaiverse_profiler_1725333423 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo5kpqj --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725333423 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725333423/summary.json'
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo5kpqj --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1725333423/summary.json'
Pipeline stage MKMLProfilerRunner completed in 546.16s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
Service zonemercy-graft-cogent-v-7573-v6-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 1.51s
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Skipping teardown as no inference service was found
Pipeline stage MKMLProfilerDeleter completed in 1.53s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.18s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Inference service zonemercy-graft-cogent-v-7573-v6-profiler ready after 150.50301241874695s
Pipeline stage MKMLProfilerDeployer completed in 150.95s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deploh9fbj:/code/chaiverse_profiler_1725346997 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deploh9fbj --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725346997 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725346997/summary.json'
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deploh9fbj --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1725346997/summary.json'
Pipeline stage MKMLProfilerRunner completed in 540.04s
cleanup pipeline after completion
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
Service zonemercy-graft-cogent-v-7573-v6-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 1.54s
registered shutdown handler
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Skipping teardown as no inference service was found
Pipeline stage MKMLProfilerDeleter completed in 2.54s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.37s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
running shutdown handler
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
%s, retrying in %s seconds...
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
running shutdown handler
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
%s, retrying in %s seconds...
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
running shutdown handler
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
clean up pipeline due to error=%s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
running shutdown handler
%s, retrying in %s seconds...
running shutdown handler
running shutdown handler
de-registered shutdown handler
running shutdown handler
registered shutdown handler
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.46s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.38s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
running shutdown handler
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
%s, retrying in %s seconds...
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
running shutdown handler
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
%s, retrying in %s seconds...
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
running shutdown handler
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
running shutdown handler
clean up pipeline due to error=%s
running shutdown handler
running shutdown handler
de-registered shutdown handler
running shutdown handler
registered shutdown handler
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.48s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.36s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.48s
de-registered shutdown handler
registered shutdown handler
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.51s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.39s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Ignoring service zonemercy-graft-cogent-v-7573-v6-profiler already deployed
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.53s
de-registered shutdown handler
registered shutdown handler
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.54s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.44s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Inference service zonemercy-graft-cogent-v-7573-v6-profiler ready after 152.07526206970215s
Pipeline stage MKMLProfilerDeployer completed in 153.60s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo2xcbg:/code/chaiverse_profiler_1725379349 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo2xcbg --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725379349 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725379349/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo2xcbg:/code/chaiverse_profiler_1725380126 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo2xcbg --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725380126 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725380126/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo2xcbg:/code/chaiverse_profiler_1725380906 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo2xcbg --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725380906 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725380906/summary.json'
clean up pipeline due to error=%s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
running shutdown handler
run pipeline stage %s
running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
running shutdown handler
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
Service zonemercy-graft-cogent-v-7573-v6-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 3.01s
de-registered shutdown handler
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Skipping teardown as no inference service was found
Pipeline stage MKMLProfilerDeleter completed in 2.67s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.40s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Inference service zonemercy-graft-cogent-v-7573-v6-profiler ready after 152.108225107193s
Pipeline stage MKMLProfilerDeployer completed in 153.45s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplonnk4l:/code/chaiverse_profiler_1725381629 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplonnk4l --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725381629 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 30 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725381629/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplonnk4l:/code/chaiverse_profiler_1725381750 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplonnk4l --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725381750 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 30 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725381750/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplonnk4l:/code/chaiverse_profiler_1725381878 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplonnk4l --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725381878 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 30 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725381878/summary.json'
clean up pipeline due to error=%s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
Service zonemercy-graft-cogent-v-7573-v6-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 3.57s
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Inference service zonemercy-graft-cogent-v-7573-v6-profiler ready after 141.99562692642212s
Pipeline stage MKMLProfilerDeployer completed in 143.44s
cleanup pipeline after completion
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
Service zonemercy-graft-cogent-v-7573-v6-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 3.34s
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Inference service zonemercy-graft-cogent-v-7573-v6-profiler ready after 151.39781522750854s
Pipeline stage MKMLProfilerDeployer completed in 152.72s
cleanup pipeline after completion
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo89gq8:/code/chaiverse_profiler_1725385346 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo89gq8 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725385346 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 30 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725385346/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo89gq8:/code/chaiverse_profiler_1725385467 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo89gq8 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725385467 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 30 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725385467/summary.json'
%s, retrying in %s seconds...
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo89gq8:/code/chaiverse_profiler_1725385589 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo89gq8 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725385589 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 30 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725385589/summary.json'
clean up pipeline due to error=%s
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo89gq8:/code/chaiverse_profiler_1725386362 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo89gq8 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725386362 && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 30 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725386362/summary.json'
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo89gq8 --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1725386362/summary.json'
Pipeline stage MKMLProfilerRunner completed in 185.32s
cleanup pipeline after completion
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
Service zonemercy-graft-cogent-v-7573-v6-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 2.83s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.22s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Inference service zonemercy-graft-cogent-v-7573-v6-profiler ready after 60.15566611289978s
Pipeline stage MKMLProfilerDeployer completed in 60.75s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo7s775:/code/chaiverse_profiler_1725389646 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo7s775 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725389646 && chmod +x profiles.py && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725389646/summary.json'
registered shutdown handler
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo7s775:/code/chaiverse_profiler_1725390975 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo7s775 --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725390975 && python profiles.py profile --best_of_n 16 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725390975/summary.json'
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplo7s775 --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1725390975/summary.json'
Pipeline stage MKMLProfilerRunner completed in 557.99s
de-registered shutdown handler
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
Service zonemercy-graft-cogent-v-7573-v6-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 2.07s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.23s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service zonemercy-graft-cogent-v-7573-v6-profiler
Waiting for inference service zonemercy-graft-cogent-v-7573-v6-profiler to be ready
Inference service zonemercy-graft-cogent-v-7573-v6-profiler ready after 120.27559208869934s
Pipeline stage MKMLProfilerDeployer completed in 120.88s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplox8nzn:/code/chaiverse_profiler_1725394278 --namespace tenant-chaiml-guanaco
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplox8nzn --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1725394278 && python profiles.py profile --best_of_n 16 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 512 --output_tokens 64 --summary /code/chaiverse_profiler_1725394278/summary.json'
kubectl exec -it zonemercy-graft-cogec2dfe602ddb31128a52549ddca4577a9-deplox8nzn --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1725394278/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1200.36s
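The profiler run above ends with a `cat` of `summary.json`, which can then be inspected programmatically. A minimal sketch, assuming the summary is a JSON list of per-batch records with the same field names as the latency schema in the model metadata (`batch_size`, `throughput`, `latency_mean`, `latency_p90`) — the inline `SUMMARY` string is a hypothetical stand-in for the real file contents:

```python
import json

# Hypothetical stand-in for the file dumped by the `cat .../summary.json`
# step above; the field names mirror the per-batch latency schema in the
# model metadata and are an assumption about the real summary format.
SUMMARY = """
[
  {"batch_size": 1, "throughput": 0.642, "latency_mean": 1.557, "latency_p90": 1.744},
  {"batch_size": 3, "throughput": 0.945, "latency_mean": 3.166, "latency_p90": 3.580},
  {"batch_size": 5, "throughput": 0.902, "latency_mean": 5.519, "latency_p90": 6.285}
]
"""

records = json.loads(SUMMARY)

# Pick the batch size with the highest measured throughput.
best = max(records, key=lambda r: r["throughput"])
print(best["batch_size"], best["throughput"])
```

In a real run the records would be read with `json.load` from the `summary.json` path shown in the `--summary` flag instead of an inline string.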
cleanup pipeline after completion
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service zonemercy-graft-cogent-v-7573-v6-profiler is running
Tearing down inference service zonemercy-graft-cogent-v-7573-v6-profiler
Service zonemercy-graft-cogent-v-7573-v6-profiler has been torn down
Pipeline stage MKMLProfilerDeleter completed in 1.74s
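Stage durations like the ones logged above ("Pipeline stage ... completed in ...s") can be pulled out of the raw log with a small parser. A minimal sketch; the regex is an assumption that the message format is exactly as it appears in these log lines:

```python
import re

# A few lines copied from the log above, standing in for the full file.
LOG = """\
Pipeline stage MKMLProfilerDeployer completed in 120.88s
Pipeline stage MKMLProfilerRunner completed in 1200.36s
Pipeline stage MKMLProfilerDeleter completed in 1.74s
"""

# Match "Pipeline stage <name> completed in <seconds>s" messages.
STAGE_RE = re.compile(r"Pipeline stage (\S+) completed in ([\d.]+)s")

# Map each stage name to its duration in seconds (last occurrence wins).
timings = {name: float(sec) for name, sec in STAGE_RE.findall(LOG)}
print(timings)
```

This makes it easy to see at a glance that the profiler run dominates the pipeline wall-clock time, while templating and teardown are negligible.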
zonemercy-virgo-edit-v2-1e5_v2 status is now torndown due to DeploymentManager action
zonemercy-graft-cogent-v_7573_v6 status is now torndown due to DeploymentManager action

Usage Metrics

Latency Metrics