<!--
# Copyright 2023–2026 Google LLC
#
# Licensed under the Apache License, Version 2.0 (the "License");
# you may not use this file except in compliance with the License.
# You may obtain a copy of the License at
#
# https://www.apache.org/licenses/LICENSE-2.0
#
# Unless required by applicable law or agreed to in writing, software
# distributed under the License is distributed on an "AS IS" BASIS,
# WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied.
# See the License for the specific language governing permissions and
# limitations under the License.
-->

# src/MaxText

The contents of `src/MaxText` have moved to `src/maxtext` as part of a larger
[restructuring effort in MaxText](https://github.com/AI-Hypercomputer/maxtext/blob/2790ed289c0c4cb704645d5d2ab91da26711b891/RESTRUCTURE.md).
This directory contains only shim files that temporarily keep legacy commands such as `python3 -m MaxText.train ...` working.
These legacy commands are deprecated and will be removed soon, so please migrate your existing commands instead of
relying on the shims. The new command locations are:

* `MaxText.decode` → `maxtext.inference.decode`
* `MaxText.distillation.train_distill` → `maxtext.trainers.post_train.distillation.train_distill`
* `MaxText.maxengine_server` → `maxtext.inference.maxengine.maxengine_server`
* `MaxText.rl.evaluate_rl` → `maxtext.trainers.post_train.rl.evaluate_rl`
* `MaxText.rl.train_rl` → `maxtext.trainers.post_train.rl.train_rl`
* `MaxText.sft.sft_trainer` → `maxtext.trainers.post_train.sft.train_sft`
* `MaxText.train` → `maxtext.trainers.pre_train.train`
* `MaxText.train_compile` → `maxtext.trainers.pre_train.train_compile`
* `MaxText.train_tokenizer` → `maxtext.trainers.tokenizer.train_tokenizer`