Text Generation
Transformers
Safetensors
llama
model: vicuna
base_model: lmsys/vicuna-7b-v1.5
repo_name: vicuna_channel_3_human_aging_Complete Random
file_name: vicuna_channel_3_human_aging_Complete Random_5000_5.pt
pruning_style: channel
community: 3
pruning_ratio: 20
dataset_label: human_aging
sparsity_ratio: 20
dataset: ['tasksource/mmlu', 'human_aging']
finetune: Complete Random
modules_size: 38
modules: ['22_attn.o', '24_attn.v', '3_mlp.up', '21_mlp.down', '16_attn.v', '3_attn.k', '12_mlp.up', '11_attn.o', '3_attn.o', '5_attn.v', '28_mlp.up', '17_mlp.up', '11_attn.q', '13_gate', '10_mlp.up', '18_mlp.up', '10_attn.v', '9_attn.o', '13_attn.o', '19_attn.v', '5_mlp.down', '29_attn.v', '17_attn.v', '23_mlp.up', '23_gate', '21_attn.q', '7_attn.k', '19_attn.q', '13_mlp.up', '16_gate', '25_attn.q', '8_mlp.down', '20_attn.q', '20_attn.v', '19_attn.k', '12_attn.o', '24_gate', '25_attn.o']
rank: 1
text-generation-inference
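
Below is a minimal, unofficial sketch of how this checkpoint might be used with the base model. It assumes (not confirmed by this repo) that the .pt file holds a PyTorch state dict and that module tags such as '22_attn.o' refer to layer 22's attention output projection (model.layers.22.self_attn.o_proj) in the LLaMA architecture; the SUFFIX_TO_PATH mapping and the resolve helper below are illustrative names, not part of the repo.

```python
# Minimal sketch (not the uploader's official loading code): load the Vicuna base
# model and inspect/apply the pruned checkpoint from this repo.
# Assumptions (not confirmed by the repo): the .pt file is a PyTorch state dict,
# and tags like '22_attn.o' map to LLaMA submodules such as
# model.layers.22.self_attn.o_proj.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

BASE = "lmsys/vicuna-7b-v1.5"
CKPT = "vicuna_channel_3_human_aging_Complete Random_5000_5.pt"  # file from this repo

# Assumed mapping from the short module tags to LLaMA submodule paths.
SUFFIX_TO_PATH = {
    "attn.q": "self_attn.q_proj",
    "attn.k": "self_attn.k_proj",
    "attn.v": "self_attn.v_proj",
    "attn.o": "self_attn.o_proj",
    "mlp.up": "mlp.up_proj",
    "mlp.down": "mlp.down_proj",
    "gate": "mlp.gate_proj",
}

def resolve(model, tag):
    """Map a tag like '22_attn.o' to the corresponding nn.Module (assumed scheme)."""
    layer, suffix = tag.split("_", 1)
    return model.get_submodule(f"model.layers.{layer}.{SUFFIX_TO_PATH[suffix]}")

tokenizer = AutoTokenizer.from_pretrained(BASE)
model = AutoModelForCausalLM.from_pretrained(BASE, torch_dtype=torch.float16)

state = torch.load(CKPT, map_location="cpu")
if isinstance(state, dict):
    # If the file is a (possibly partial) state dict, load whatever keys match.
    missing, unexpected = model.load_state_dict(state, strict=False)
    print(f"loaded; {len(missing)} missing, {len(unexpected)} unexpected keys")

# Example: inspect one of the 38 modules listed under `modules` above.
print(resolve(model, "22_attn.o"))
```

If the file instead stores a full model object or pruning masks rather than a state dict, it would need to be loaded accordingly; the strict=False load above simply skips keys that do not match the base model.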