---
license: apache-2.0
---
## ICMA version of Mistral-7B-Instruct-v0.2 for the molecule captioning task (Mol2Cap) from the paper "Large Language Models are In-Context Molecule Learners"
#### Notice: The input should contain 2 context examples, and the cutoff length should be set to 2048 to ensure the best performance.
Paper Link: https://arxiv.org/abs/2403.04197
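
As a rough illustration of the notice above, the sketch below loads the checkpoint with Hugging Face Transformers and builds a prompt containing two context examples, truncating the input to 2048 tokens. The repository id is a placeholder, and the prompt template and example molecule/caption pairs are assumptions for illustration only, not the exact ICMA retrieval format described in the paper.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repo id -- replace with this model's actual Hugging Face repository.
model_id = "your-namespace/ICMA-Mistral-7B-Instruct-v0.2-Mol2Cap"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype="auto", device_map="auto")

# Two in-context examples (molecule -> caption pairs), followed by the query molecule.
# The molecules and captions below are illustrative; in ICMA these examples are
# retrieved for the query molecule. The template is an assumption, not the official one.
prompt = (
    "Molecule: CCO\n"
    "Caption: The molecule is ethanol, a simple primary alcohol.\n\n"
    "Molecule: CC(=O)O\n"
    "Caption: The molecule is acetic acid, a simple carboxylic acid.\n\n"
    "Molecule: c1ccccc1O\n"
    "Caption:"
)

# Cutoff length of 2048 as recommended in the notice above.
inputs = tokenizer(prompt, return_tensors="pt", truncation=True, max_length=2048).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated caption tokens.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```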