Commit 885cae4

Merge: 2 parents 6b4e3cb + c452219

1 file changed: 12 additions, 1 deletion

File changed: README.md
@@ -61,6 +61,7 @@ output = predict_dna_sequence(
     tokenizer=tokenizer,
     model=model,
     attention_type="original_full",
+    deterministic=True
 )
 print(format_model_output(output))
 ```
@@ -86,13 +87,23 @@ M_UNK A_UNK L_UNK W_UNK M_UNK R_UNK L_UNK L_UNK P_UNK L_UNK L_UNK A_UNK L_UNK L_
 -----------------------------
 ATGGCTTTATGGATGCGTCTGCTGCCGCTGCTGGCGCTGCTGGCGCTGTGGGGCCCGGACCCGGCGGCGGCGTTTGTGAATCAGCACCTGTGCGGCAGCCACCTGGTGGAAGCGCTGTATCTGGTGTGCGGTGAGCGCGGCTTCTTCTACACGCCCAAAACCCGCCGCGAAGCGGAAGATCTGCAGGTGGGCCAGGTGGAGCTGGGCGGCTAA
 ```
+
+### Generating Multiple Variable Sequences
+
+Set `deterministic=False` to generate variable sequences, and control the variability with `temperature`:
+
+- `temperature` (recommended between 0.2 and 0.8):
+  - Lower values (e.g., 0.2): more conservative predictions
+  - Higher values (e.g., 0.8): more diverse predictions
+
+Using high temperatures may produce DNA sequences that do not translate to the input protein. <br>
+Generate multiple sequences by setting `num_sequences` to a value greater than 1.
 <br>

 **You can use the [inference template](https://github.com/Adibvafa/CodonTransformer/raw/main/src/CodonTransformer_inference_template.xlsx) for batch inference in [Google Colab](https://adibvafa.github.io/CodonTransformer/GoogleColab).**

 <br>

-
 ## Installation
 Install CodonTransformer via pip:

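The temperature behavior documented in the diff above follows standard temperature-scaled softmax sampling. The sketch below is a minimal, self-contained illustration of that general mechanism (plain softmax math, not CodonTransformer's internals): dividing logits by a low temperature sharpens the distribution toward the top prediction (conservative), while a higher temperature flattens it (diverse). The logit values are hypothetical.

```python
import math

def softmax_with_temperature(logits, temperature):
    # Scale logits by 1/temperature before the softmax:
    # T < 1 sharpens the distribution (conservative sampling),
    # T > 1 flattens it (more diverse sampling).
    scaled = [x / temperature for x in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical scores for three candidate codons.
logits = [2.0, 1.0, 0.5]

conservative = softmax_with_temperature(logits, 0.2)
diverse = softmax_with_temperature(logits, 0.8)
```

At temperature 0.2 the top candidate dominates the distribution, while at 0.8 the lower-scoring candidates retain meaningful probability, which is why very high temperatures can yield sequences that no longer translate back to the input protein.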