CONTRIBUTING.md
```
python3 -m venv venv
poetry install --with dev
```

> Note: After installing the dev dependencies, if you wish to use [FlashAttention](https://github.com/Dao-AILab/flash-attention), then you also need to install its requirements:
```
poetry install --with dev,flash-attn
```
If you wish to use [aim](https://github.com/aimhubio/aim), then you need to install it:
```
poetry install --with aim
```
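
Each name passed to `--with` above refers to an optional Poetry dependency group. As a rough sketch (the group and package names here are illustrative assumptions; the authoritative definitions live in this repository's `pyproject.toml`), such a group is declared like:

```
[tool.poetry.group.aim]
optional = true

[tool.poetry.group.aim.dependencies]
aim = "*"
```

Because the group is marked `optional = true`, a plain `poetry install` skips it; it is only installed when explicitly requested via `--with aim`.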
If you wish to use [fms-acceleration](https://github.com/foundation-model-stack/fms-acceleration), then you need to install it:
```
poetry install --with fms-accel
```
`fms-acceleration` is a collection of plugins that accelerate fine-tuning / training of large models, as part of the `fms-hf-tuning` suite. For more details, see [this section below](#fms-acceleration).
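
As the `dev,flash-attn` example above shows, `poetry install --with` accepts a comma-separated list of groups, so several optional extras can be installed in one command, for example:

```
poetry install --with dev,aim,fms-accel
```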