{
  "model_type": "seq2seq",
  "task": "code-generation",
  "vocab_size": 50000,
  "batch_size": 32,
  "learning_rate": 0.0001,
  "num_epochs": 3,
  "max_length": 512
}
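
The file above is a plain JSON training configuration. As a minimal sketch (assuming it is saved as config.json in the working directory; the source does not name the file), it can be read in Python with the standard json module:

import json

# Load the training configuration shown above.
# "config.json" is an assumed filename; the source does not specify it.
with open("config.json") as f:
    config = json.load(f)

# Hyperparameters are accessed by the keys listed in the file.
print(config["model_type"])     # seq2seq
print(config["learning_rate"])  # 0.0001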