submission_id: jellywibble-mistralsmall_4077_v2
developer_uid: Jellywibble
best_of: 8
celo_rating: 1223.27
display_name: mistral-dummy
family_friendly_score: 0.0
formatter: {'memory_template': 'User (You) writes short messages. Character {bot_name} writes detailed and engaging messages.', 'prompt_template': '', 'bot_template': '{bot_name}: {message}\n', 'user_template': '{user_name}: {message}\n', 'response_template': '{bot_name}:', 'truncate_by_message': False}
generation_params: {'temperature': 1.0, 'top_p': 1.0, 'min_p': 0.0, 'top_k': 100, 'presence_penalty': 0.0, 'frequency_penalty': 0.0, 'stopping_words': ['\n'], 'max_input_tokens': 1024, 'best_of': 8, 'max_output_tokens': 64}
gpu_counts: {'NVIDIA RTX A6000': 1}
is_internal_developer: True
language_model: Jellywibble/MistralSmall1500CTXDummy
latencies: [{'batch_size': 1, 'throughput': 0.3857938474972485, 'latency_mean': 2.5919965541362764, 'latency_p50': 2.5926140546798706, 'latency_p90': 2.8846273899078367}, {'batch_size': 2, 'throughput': 0.614476348536287, 'latency_mean': 3.245246946811676, 'latency_p50': 3.2661200761795044, 'latency_p90': 3.5752063751220704}, {'batch_size': 3, 'throughput': 0.7803650008456086, 'latency_mean': 3.831652729511261, 'latency_p50': 3.8281816244125366, 'latency_p90': 4.225851440429688}, {'batch_size': 4, 'throughput': 0.9017221786125817, 'latency_mean': 4.415478911399841, 'latency_p50': 4.434523344039917, 'latency_p90': 4.902445220947265}, {'batch_size': 5, 'throughput': 0.9892986016760068, 'latency_mean': 5.029519321918488, 'latency_p50': 5.041569113731384, 'latency_p90': 5.636698579788208}]
max_input_tokens: 1024
max_output_tokens: 64
model_architecture: MistralForCausalLM
model_group: Jellywibble/MistralSmall
model_name: mistral-dummy
model_num_parameters: 22247282688.0
model_repo: Jellywibble/MistralSmall1500CTXDummy
model_size: 22B
num_battles: 245779
num_wins: 114291
ranking_group: single
status: torndown
submission_type: basic
throughput_3p7s: 0.75
timestamp: 2024-09-22T02:43:40+00:00
us_pacific_date: 2024-09-21
win_ratio: 0.46501531863991635
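The win_ratio field is simply num_wins divided by num_battles; a quick check against the values above:

```python
# Reproducing win_ratio from the num_wins and num_battles fields above.
num_battles = 245779
num_wins = 114291
win_ratio = num_wins / num_battles
print(win_ratio)  # ≈ 0.4650, matching the reported win_ratio
```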
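The throughput_3p7s field appears to be throughput at a 3.7 s mean latency. This is an assumption: linearly interpolating the (latency_mean, throughput) pairs from the latencies field gives roughly 0.74, close to but not exactly the reported 0.75 — the published figure likely comes from the fuller profiler sweep (batches 1–195) rather than these five points.

```python
# Assumption: throughput_3p7s is throughput linearly interpolated at a
# mean latency of 3.7 s from the latencies field above.
points = [
    (2.5919965541362764, 0.3857938474972485),  # batch_size 1
    (3.245246946811676, 0.614476348536287),    # batch_size 2
    (3.831652729511261, 0.7803650008456086),   # batch_size 3
    (4.415478911399841, 0.9017221786125817),   # batch_size 4
    (5.029519321918488, 0.9892986016760068),   # batch_size 5
]

def interp(x, pts):
    # Simple piecewise-linear interpolation over sorted (x, y) points.
    for (x0, y0), (x1, y1) in zip(pts, pts[1:]):
        if x0 <= x <= x1:
            return y0 + (x - x0) * (y1 - y0) / (x1 - x0)
    raise ValueError("x outside measured range")

print(interp(3.7, points))  # ≈ 0.743, close to the reported 0.75
```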
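The formatter field above defines how a conversation is assembled into a prompt: a memory preamble, then one templated line per message, then the response template cueing the model to continue as the character. A minimal sketch — build_prompt, the Mira/You names, and the message list are hypothetical; the actual Chaiverse assembly code may differ:

```python
# Hypothetical illustration of applying the formatter templates above;
# not the platform's real prompt-assembly code.
def build_prompt(memory_template, bot_template, user_template,
                 response_template, bot_name, user_name, messages):
    # messages: list of (sender, text) pairs, sender is "user" or "bot"
    parts = [memory_template.format(bot_name=bot_name)]
    for sender, text in messages:
        template = user_template if sender == "user" else bot_template
        parts.append(template.format(bot_name=bot_name,
                                     user_name=user_name,
                                     message=text))
    # The response template ends the prompt with "{bot_name}:" so the
    # model generates the character's next message.
    parts.append(response_template.format(bot_name=bot_name))
    return "".join(parts)

prompt = build_prompt(
    "User (You) writes short messages. Character {bot_name} writes detailed and engaging messages.",
    "{bot_name}: {message}\n",
    "{user_name}: {message}\n",
    "{bot_name}:",
    bot_name="Mira", user_name="You",
    messages=[("user", "hi"), ("bot", "Hello there!")],
)
```

With stopping_words set to ["\n"] in generation_params, generation then halts at the end of the character's single line.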
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLizer
Starting job with name jellywibble-mistralsmall-4077-v2-mkmlizer
Waiting for job on jellywibble-mistralsmall-4077-v2-mkmlizer to finish
jellywibble-mistralsmall-4077-v2-mkmlizer: ╔═════════════════════════════════════════════════════════════════════╗
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ _____ __ __ ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ / _/ /_ ___ __/ / ___ ___ / / ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ / _/ / // / |/|/ / _ \/ -_) -_) / ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ /_//_/\_, /|__,__/_//_/\__/\__/_/ ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ /___/ ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ Version: 0.10.1 ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ Copyright 2023 MK ONE TECHNOLOGIES Inc. ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ https://mk1.ai ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ The license key for the current software has been verified as ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ belonging to: ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ Chai Research Corp. ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ Account ID: 7997a29f-0ceb-4cc7-9adf-840c57b4ae6f ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ Expiration: 2024-10-15 23:59:59 ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ║ ║
jellywibble-mistralsmall-4077-v2-mkmlizer: ╚═════════════════════════════════════════════════════════════════════╝
jellywibble-mistralsmall-4077-v2-mkmlizer: Downloaded to shared memory in 52.857s
jellywibble-mistralsmall-4077-v2-mkmlizer: quantizing model to /dev/shm/model_cache, profile:s0, folder:/tmp/tmp7xrpp3s8, device:0
jellywibble-mistralsmall-4077-v2-mkmlizer: Saving flywheel model at /dev/shm/model_cache
jellywibble-mistralsmall-4077-v2-mkmlizer: quantized model in 43.213s
jellywibble-mistralsmall-4077-v2-mkmlizer: Processed model Jellywibble/MistralSmall1500CTXDummy in 96.070s
jellywibble-mistralsmall-4077-v2-mkmlizer: creating bucket guanaco-mkml-models
jellywibble-mistralsmall-4077-v2-mkmlizer: Bucket 's3://guanaco-mkml-models/' created
jellywibble-mistralsmall-4077-v2-mkmlizer: uploading /dev/shm/model_cache to s3://guanaco-mkml-models/jellywibble-mistralsmall-4077-v2
jellywibble-mistralsmall-4077-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer_config.json s3://guanaco-mkml-models/jellywibble-mistralsmall-4077-v2/tokenizer_config.json
jellywibble-mistralsmall-4077-v2-mkmlizer: cp /dev/shm/model_cache/tokenizer.json s3://guanaco-mkml-models/jellywibble-mistralsmall-4077-v2/tokenizer.json
jellywibble-mistralsmall-4077-v2-mkmlizer: cp /dev/shm/model_cache/flywheel_model.1.safetensors s3://guanaco-mkml-models/jellywibble-mistralsmall-4077-v2/flywheel_model.1.safetensors
jellywibble-mistralsmall-4077-v2-mkmlizer: Loading 0: 100%|█████████▉| 505/507 [00:30<00:00, 33.85it/s]
Job jellywibble-mistralsmall-4077-v2-mkmlizer completed after 125.39s with status: succeeded
Stopping job with name jellywibble-mistralsmall-4077-v2-mkmlizer
Pipeline stage MKMLizer completed in 126.53s
run pipeline stage %s
Running pipeline stage MKMLTemplater
Pipeline stage MKMLTemplater completed in 0.30s
run pipeline stage %s
Running pipeline stage MKMLDeployer
Creating inference service jellywibble-mistralsmall-4077-v2
Waiting for inference service jellywibble-mistralsmall-4077-v2 to be ready
Connection pool is full, discarding connection: %s. Connection pool size: %s
Failed to get response for submission blend_negek_2024-09-20: ('http://chaiml-llama-8b-pairwis-8189-v27-predictor.tenant-chaiml-guanaco.k2.chaiverse.com/v1/models/GPT-J-6B-lit-v2:predict', 'readfrom tcp 127.0.0.1:56056->127.0.0.1:8080: write tcp 127.0.0.1:56056->127.0.0.1:8080: use of closed network connection\n')
Inference service jellywibble-mistralsmall-4077-v2 ready after 191.59013748168945s
Pipeline stage MKMLDeployer completed in 192.03s
run pipeline stage %s
Running pipeline stage StressChecker
Received healthy response to inference request in 3.8359010219573975s
Received healthy response to inference request in 2.807389736175537s
Received healthy response to inference request in 3.0763003826141357s
Connection pool is full, discarding connection: %s. Connection pool size: %s
Received healthy response to inference request in 2.9980850219726562s
Received healthy response to inference request in 2.7697410583496094s
5 requests
0 failed requests
5th percentile: 2.777270793914795
10th percentile: 2.7848005294799805
20th percentile: 2.7998600006103516
30th percentile: 2.845528793334961
40th percentile: 2.9218069076538087
50th percentile: 2.9980850219726562
60th percentile: 3.029371166229248
70th percentile: 3.0606573104858397
80th percentile: 3.2282205104827884
90th percentile: 3.5320607662200927
95th percentile: 3.683980894088745
99th percentile: 3.805516996383667
mean time: 3.097483444213867
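The StressChecker percentiles above are consistent with linear interpolation between closest ranks (numpy.percentile's default method) over the five response times; a pure-Python check:

```python
# Response times from the five healthy StressChecker requests above.
times = sorted([
    3.8359010219573975,
    2.807389736175537,
    3.0763003826141357,
    2.9980850219726562,
    2.7697410583496094,
])

def percentile(xs, q):
    # Linear interpolation between closest ranks over sorted data
    # (the same scheme as numpy.percentile's default method).
    idx = q / 100 * (len(xs) - 1)
    lo = int(idx)
    frac = idx - lo
    if lo + 1 < len(xs):
        return xs[lo] + frac * (xs[lo + 1] - xs[lo])
    return xs[lo]

print(percentile(times, 50))    # 2.9980850219726562, as logged
print(percentile(times, 90))    # ≈ 3.53206, matching the logged 90th percentile
print(sum(times) / len(times))  # ≈ 3.09748, matching the logged mean time
```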
Pipeline stage StressChecker completed in 17.23s
run pipeline stage %s
Running pipeline stage TriggerMKMLProfilingPipeline
run_pipeline:run_in_cloud %s
starting trigger_guanaco_pipeline args=%s
Pipeline stage TriggerMKMLProfilingPipeline completed in 4.45s
Shutdown handler de-registered
jellywibble-mistralsmall_4077_v2 status is now deployed due to DeploymentManager action
Shutdown handler registered
run pipeline %s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Skipping teardown as no inference service was successfully deployed
Pipeline stage MKMLProfilerDeleter completed in 0.10s
run pipeline stage %s
Running pipeline stage MKMLProfilerTemplater
Pipeline stage MKMLProfilerTemplater completed in 0.10s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeployer
Creating inference service jellywibble-mistralsmall-4077-v2-profiler
Waiting for inference service jellywibble-mistralsmall-4077-v2-profiler to be ready
Inference service jellywibble-mistralsmall-4077-v2-profiler ready after 190.44057416915894s
Pipeline stage MKMLProfilerDeployer completed in 190.78s
run pipeline stage %s
Running pipeline stage MKMLProfilerRunner
kubectl cp /code/guanaco/guanaco_inference_services/src/inference_scripts tenant-chaiml-guanaco/jellywibble-mistralsfe5547938a6c48191b867ce5ffb8df77-deplockqcq:/code/chaiverse_profiler_1726973598 --namespace tenant-chaiml-guanaco
kubectl exec -it jellywibble-mistralsfe5547938a6c48191b867ce5ffb8df77-deplockqcq --namespace tenant-chaiml-guanaco -- sh -c 'cd /code/chaiverse_profiler_1726973598 && python profiles.py profile --best_of_n 8 --auto_batch 5 --batches 1,5,10,15,20,25,30,35,40,45,50,55,60,65,70,75,80,85,90,95,100,105,110,115,120,125,130,135,140,145,150,155,160,165,170,175,180,185,190,195 --samples 200 --input_tokens 1024 --output_tokens 64 --summary /code/chaiverse_profiler_1726973598/summary.json'
kubectl exec -it jellywibble-mistralsfe5547938a6c48191b867ce5ffb8df77-deplockqcq --namespace tenant-chaiml-guanaco -- bash -c 'cat /code/chaiverse_profiler_1726973598/summary.json'
Pipeline stage MKMLProfilerRunner completed in 1531.48s
run pipeline stage %s
Running pipeline stage MKMLProfilerDeleter
Checking if service jellywibble-mistralsmall-4077-v2-profiler is running
Tearing down inference service jellywibble-mistralsmall-4077-v2-profiler
Service jellywibble-mistralsmall-4077-v2-profiler has been torndown
Pipeline stage MKMLProfilerDeleter completed in 2.20s
Shutdown handler de-registered
jellywibble-mistralsmall_4077_v2 status is now inactive due to auto deactivation (removal of underperforming models)
admin requested tearing down of jellywibble-mistralsmall_4077_v2
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline %s
admin requested tearing down of jellywibble-mistralsmall_8515_v1
run pipeline stage %s
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLDeleter
admin requested tearing down of jellywibble-mistralsmall_v1
run pipeline %s
Checking if service jellywibble-mistralsmall-4077-v2 is running
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
admin requested tearing down of jic062-dpo-v1-6-nemo_v1
run pipeline %s
Running pipeline stage MKMLDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
run pipeline stage %s
admin requested tearing down of mistralai-mistral-small_5341_v24
Checking if service jellywibble-mistralsmall-8515-v1 is running
run pipeline %s
Running pipeline stage MKMLDeleter
Shutdown handler not registered because Python interpreter is not running in the main thread
admin requested tearing down of riverise-mistral-0920-7872_v1
run pipeline stage %s
run pipeline %s
Checking if service jellywibble-mistralsmall-v1 is running
Shutdown handler not registered because Python interpreter is not running in the main thread
Running pipeline stage MKMLDeleter
Tearing down inference service jellywibble-mistralsmall-4077-v2
run pipeline stage %s
run pipeline %s
Tearing down inference service jellywibble-mistralsmall-8515-v1
Checking if service jic062-dpo-v1-6-nemo-v1 is running
Running pipeline stage MKMLDeleter
Service jellywibble-mistralsmall-4077-v2 has been torndown
run pipeline stage %s
Service jellywibble-mistralsmall-8515-v1 has been torndown
Checking if service mistralai-mistral-small-5341-v24 is running
Pipeline stage MKMLDeleter completed in 3.25s
Running pipeline stage MKMLDeleter
Pipeline stage MKMLDeleter completed in 2.93s
run pipeline stage %s
Tearing down inference service jellywibble-mistralsmall-v1
Checking if service riverise-mistral-0920-7872-v1 is running
run pipeline stage %s
Running pipeline stage MKMLModelDeleter
Service jellywibble-mistralsmall-v1 has been torndown
Running pipeline stage MKMLModelDeleter
Tearing down inference service jic062-dpo-v1-6-nemo-v1
Cleaning model data from S3
Tearing down inference service mistralai-mistral-small-5341-v24
Pipeline stage MKMLDeleter completed in 3.79s
Cleaning model data from S3
Cleaning model data from model cache
Service jic062-dpo-v1-6-nemo-v1 has been torndown
Tearing down inference service riverise-mistral-0920-7872-v1
run pipeline stage %s
Cleaning model data from model cache
Service mistralai-mistral-small-5341-v24 has been torndown
Pipeline stage MKMLDeleter completed in 4.49s
Deleting key jellywibble-mistralsmall-4077-v2/config.json from bucket guanaco-mkml-models
Running pipeline stage MKMLModelDeleter
Deleting key jellywibble-mistralsmall-8515-v1/config.json from bucket guanaco-mkml-models
Pipeline stage MKMLDeleter completed in 3.77s
Service riverise-mistral-0920-7872-v1 has been torndown
run pipeline stage %s
Deleting key jellywibble-mistralsmall-4077-v2/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Cleaning model data from S3
Deleting key jellywibble-mistralsmall-8515-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
run pipeline stage %s
Pipeline stage MKMLDeleter completed in 3.94s
Running pipeline stage MKMLModelDeleter
Cleaning model data from model cache
Running pipeline stage MKMLModelDeleter
run pipeline stage %s
Cleaning model data from S3
Deleting key jellywibble-mistralsmall-v1/config.json from bucket guanaco-mkml-models
Cleaning model data from S3
Running pipeline stage MKMLModelDeleter
Cleaning model data from model cache
Deleting key jellywibble-mistralsmall-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key jellywibble-mistralsmall-4077-v2/flywheel_model.1.safetensors from bucket guanaco-mkml-models
Cleaning model data from S3
Deleting key jellywibble-mistralsmall-8515-v1/flywheel_model.1.safetensors from bucket guanaco-mkml-models
Deleting key jic062-dpo-v1-6-nemo-v1/config.json from bucket guanaco-mkml-models
Deleting key mistralai-mistral-small-5341-v24/config.json from bucket guanaco-mkml-models
Cleaning model data from model cache
Deleting key jic062-dpo-v1-6-nemo-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key mistralai-mistral-small-5341-v24/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-4077-v2/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key riverise-mistral-0920-7872-v1/config.json from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-8515-v1/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-v1/flywheel_model.1.safetensors from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-4077-v2/tokenizer.json from bucket guanaco-mkml-models
Deleting key riverise-mistral-0920-7872-v1/flywheel_model.0.safetensors from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-8515-v1/tokenizer.json from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-4077-v2/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-8515-v1/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key jic062-dpo-v1-6-nemo-v1/special_tokens_map.json from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-v1/special_tokens_map.json from bucket guanaco-mkml-models
Pipeline stage MKMLModelDeleter completed in 6.73s
Deleting key mistralai-mistral-small-5341-v24/flywheel_model.1.safetensors from bucket guanaco-mkml-models
Pipeline stage MKMLModelDeleter completed in 6.74s
Deleting key jic062-dpo-v1-6-nemo-v1/tokenizer.json from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-v1/tokenizer.json from bucket guanaco-mkml-models
Shutdown handler de-registered
Deleting key riverise-mistral-0920-7872-v1/special_tokens_map.json from bucket guanaco-mkml-models
Shutdown handler de-registered
Deleting key jic062-dpo-v1-6-nemo-v1/tokenizer_config.json from bucket guanaco-mkml-models
Deleting key jellywibble-mistralsmall-v1/tokenizer_config.json from bucket guanaco-mkml-models
Shutdown handler de-registered
jellywibble-mistralsmall_4077_v2 status is now torndown due to DeploymentManager action
Deleting key mistralai-mistral-small-5341-v24/special_tokens_map.json from bucket guanaco-mkml-models