---
license: mit
tags:
- model-protection
- intellectual-property
- image-classification
- oxford-pets
- modellock
datasets:
- oxford-iiit-pet
pipeline_tag: image-classification
---

# ModelLock: Locking Your Model With a Spell

Official model repository for the paper: [ModelLock: Locking Your Model With a Spell](https://arxiv.org/abs/2405.16285)

## Overview

This repository contains the locked model checkpoint for the Oxford-IIIT Pet dataset, produced with the ModelLock framework using a style-based transformation.

## Checkpoint Information

**Model**: MAE (Masked Autoencoder) fine-tuned on Oxford-IIIT Pet dataset  
**Lock Type**: Style lock  
**Dataset**: Oxford-IIIT Pet (37 classes)

## Model Hyperparameters

The model was locked using the following configuration:

### Diffusion Model
- **Model**: `timbrooks/instruct-pix2pix` (InstructPix2Pix)

### Transformation Parameters
- **Prompt**: `"with oil pastel"`
- **Alpha** (blending ratio): `0.5`
- **Inference Steps**: `5`
- **Image Guidance Scale**: `1.5`
- **Guidance Scale**: `4.5`
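The alpha parameter controls how strongly the diffusion-edited image is mixed back into the original. A minimal sketch of that blending step is below; the function name and implementation are illustrative, not taken from the ModelLock codebase:

```python
import numpy as np
from PIL import Image

def blend_images(original: Image.Image, edited: Image.Image,
                 alpha: float = 0.5) -> Image.Image:
    """Blend the diffusion-edited image into the original.

    alpha=0.5 matches this checkpoint's blending ratio: each output
    pixel is alpha * edited + (1 - alpha) * original.
    """
    orig = np.asarray(original, dtype=np.float32)
    edit = np.asarray(edited, dtype=np.float32)
    blended = alpha * edit + (1.0 - alpha) * orig
    return Image.fromarray(blended.round().astype(np.uint8))
```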

## Download Checkpoint

```bash
huggingface-cli download SFTJBD/ModelLock pets_mae_style_checkpoint-best.pth --local-dir ./checkpoints
```

Or using Python:
```python
from huggingface_hub import hf_hub_download
checkpoint_path = hf_hub_download(
    repo_id="SFTJBD/ModelLock", 
    filename="pets_mae_style_checkpoint-best.pth"
)
```
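Once downloaded, the checkpoint can be loaded on CPU before restoring it into an MAE backbone. A minimal sketch, assuming the common MAE fine-tuning convention of storing weights under a `"model"` key (not verified against this checkpoint):

```python
import torch

def load_locked_state_dict(checkpoint_path: str) -> dict:
    """Load the checkpoint on CPU and return its model weights.

    The "model" key is an assumption based on typical MAE fine-tuning
    checkpoints; if it is absent, the raw object is returned instead.
    """
    ckpt = torch.load(checkpoint_path, map_location="cpu")
    return ckpt.get("model", ckpt) if isinstance(ckpt, dict) else ckpt
```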

## Usage

To evaluate the locked model, transform input images with the key prompt `"with oil pastel"` and the hyperparameters listed above; only images edited this way unlock the model's full performance.
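The key-prompt edit that precedes classification can be sketched with the `diffusers` InstructPix2Pix pipeline, using this checkpoint's hyperparameters as defaults. This is an illustration only: the alpha blending and any preprocessing order follow the ModelLock codebase, which this sketch does not reproduce.

```python
from PIL import Image

def unlock_transform(image: Image.Image,
                     prompt: str = "with oil pastel",
                     num_inference_steps: int = 5,
                     image_guidance_scale: float = 1.5,
                     guidance_scale: float = 4.5) -> Image.Image:
    """Edit an image with the key prompt before feeding it to the model.

    Downloads timbrooks/instruct-pix2pix on first call; the defaults
    mirror the transformation parameters listed in this card.
    """
    # Imported lazily so the function can be defined without diffusers installed.
    from diffusers import StableDiffusionInstructPix2PixPipeline

    pipe = StableDiffusionInstructPix2PixPipeline.from_pretrained(
        "timbrooks/instruct-pix2pix"
    )
    result = pipe(prompt, image=image,
                  num_inference_steps=num_inference_steps,
                  image_guidance_scale=image_guidance_scale,
                  guidance_scale=guidance_scale)
    return result.images[0]
```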

## Citation

```bibtex
@article{gao2024modellock,
  title={ModelLock: Locking Your Model With a Spell},
  author={Gao, Yifeng and Sun, Yuhua and Ma, Xingjun and Wu, Zuxuan and Jiang, Yu-Gang},
  journal={arXiv preprint arXiv:2405.16285},
  year={2024}
}
```

## License

MIT License