---
license: mit
language:
- en
- it
base_model:
- microsoft/Phi-3-mini-4k-instruct
tags:
- translation
---

## PhiMaestra - A small model for Italian translation based on Phi-3

This model was finetuned on roughly 500,000 examples from the `Tatoeba` dataset, covering translations from English to Italian and from Italian to English.
The finetuning was set up so that the model directly returns a translation without any additional explanation.
It is based on Microsoft's `Phi-3` model.

Finetuning took about 10 hours on an NVIDIA A10G GPU.

## Usage

```python
import torch
from transformers import AutoTokenizer, AutoModelForCausalLM, pipeline

model_name = "LeonardPuettmann/PhiMaestra-3-Translation"
model = AutoModelForCausalLM.from_pretrained(
    model_name,
    device_map="auto",
    trust_remote_code=True,
    torch_dtype=torch.bfloat16
)

tokenizer = AutoTokenizer.from_pretrained(model_name, add_bos_token=True, trust_remote_code=True)


pipe = pipeline(
    "text-generation", # use "text-generation", not "translation": this is still a decoder-only causal LM
    model=model,
    tokenizer=tokenizer,
)

generation_args = {
    "max_new_tokens": 1024,
    "return_full_text": False,
    "do_sample": False, # greedy decoding for deterministic translations
}

print("Type '/Exit' to exit.")
while True:
    user_input = input("You: ")
    if user_input.strip().lower() == "/exit":
        print("Exiting the chatbot. Goodbye!")
        break

    row_json = [
        {"role": "system", "content": "translate English to Italian: "}, # use the system prompt "translate Italian to English: " for IT->EN
        {"role": "user", "content": user_input},
    ]

    output = pipe(row_json, **generation_args)
    print(f"PhiMaestra: {output[0]['generated_text']}")
```
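
The translation direction is selected entirely by the system prompt, as noted in the snippet above. A small helper for building the message list, so the direction string can't silently drift from the prompts the model was finetuned on, might look like this (the helper itself is hypothetical and not part of the model repository; the two system prompts are the ones shown above):

```python
# Hypothetical helper: builds the chat messages PhiMaestra expects.
# The two system prompts are the ones the model was finetuned on.
SYSTEM_PROMPTS = {
    "en-it": "translate English to Italian: ",
    "it-en": "translate Italian to English: ",
}

def build_messages(text: str, direction: str = "en-it") -> list[dict]:
    """Return the [system, user] message list for the given direction."""
    if direction not in SYSTEM_PROMPTS:
        raise ValueError(f"Unknown direction: {direction!r}")
    return [
        {"role": "system", "content": SYSTEM_PROMPTS[direction]},
        {"role": "user", "content": text},
    ]

# Usage with the pipeline from above:
# output = pipe(build_messages("Ciao, come stai?", "it-en"), **generation_args)
```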