MOSH-2174: Expose gradient_accumulation_steps in the API#363

Merged
blainekasten merged 4 commits into main from nikita/gas_expose on May 11, 2026
Conversation

@nikita-smetanin
Contributor

Summary

  • Add gradient_accumulation_steps field to FinetuneRequest pydantic model
  • Add parameter to sync and async create() methods in FineTuningResource
  • Thread through create_finetune_request() builder function
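The PR itself isn't shown here, but the three bullets above can be sketched as follows. This is a hypothetical, simplified illustration: the real `FinetuneRequest` is a pydantic model in the Together client with many more fields, and the exact signatures are assumptions based on the summary.

```python
from dataclasses import dataclass
from typing import Optional


# Simplified stand-in for the pydantic FinetuneRequest model;
# only the fields relevant to this PR are shown.
@dataclass
class FinetuneRequest:
    model: str
    training_file: str
    gradient_accumulation_steps: Optional[int] = None  # new field


def create_finetune_request(
    model: str,
    training_file: str,
    gradient_accumulation_steps: Optional[int] = None,
) -> FinetuneRequest:
    """Builder that threads the new parameter through to the request model."""
    if gradient_accumulation_steps is not None and gradient_accumulation_steps < 1:
        raise ValueError("gradient_accumulation_steps must be >= 1")
    return FinetuneRequest(
        model=model,
        training_file=training_file,
        gradient_accumulation_steps=gradient_accumulation_steps,
    )
```

The sync and async `create()` methods would then accept the same keyword argument and forward it to this builder unchanged.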

Test plan

  • CI passes
  • client.fine_tuning.create(..., gradient_accumulation_steps=4) works
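For context, gradient accumulation simulates a larger effective batch (micro-batch size × accumulation steps) by summing gradients over several micro-batches before taking one optimizer step. A toy numeric sketch of that averaging, purely illustrative and unrelated to the server-side implementation:

```python
def accumulate(gradients, gradient_accumulation_steps):
    """Average each window of micro-batch gradients into one update."""
    updates = []
    acc, count = 0.0, 0
    for g in gradients:
        acc += g
        count += 1
        if count == gradient_accumulation_steps:
            # One optimizer step per window of accumulated gradients.
            updates.append(acc / gradient_accumulation_steps)
            acc, count = 0.0, 0
    return updates


print(accumulate([1.0, 2.0, 3.0, 4.0], 2))  # [1.5, 3.5]
```

With `gradient_accumulation_steps=4` and a micro-batch of 8, the effective batch size per update is 32.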

🤖 Generated with Claude Code

- Add gradient_accumulation_steps field to FinetuneRequest model
- Add parameter to sync and async create() methods
- Thread through create_finetune_request() function

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
@nikita-smetanin nikita-smetanin marked this pull request as ready for review May 11, 2026 14:06
@mintlify

mintlify Bot commented May 11, 2026

Docs PR opened: https://github.com/togethercomputer/mintlify-docs/pull/815

Documented the new gradient_accumulation_steps parameter in the OpenAPI spec (request and response schemas) and in the fine-tuning quickstart.

Co-Authored-By: Claude Opus 4.6 (1M context) <noreply@anthropic.com>
Comment thread src/together/lib/cli/api/fine_tuning/create.py Outdated
@blainekasten blainekasten merged commit 9b32b95 into main May 11, 2026
10 checks passed