Pull requests: run-llama/llama_index
feat(readers/service-now): add OAuth2 Client Credentials Grant Flow s…
size:XL
This PR changes 500-999 lines, ignoring generated files.
#21308 opened Apr 5, 2026 by manoj-bhamsagar
12 of 15 tasks
feat(llama-index-readers-confluence): inject customizable HTML page parser and bump to 0.7.1
size:L
This PR changes 100-499 lines, ignoring generated files.
fix(llms-ollama): correct streaming chunk handling in stream_chat and astream_chat
size:L
#21303 opened Apr 4, 2026 by balgaly
feat: add VerificationQueryEngine component (#21213)
size:M
This PR changes 30-99 lines, ignoring generated files.
#21302 opened Apr 4, 2026 by DYNOSuprovo
7 of 18 tasks
Preserve cache writes from multiprocessing workers in IngestionPipeline
size:L
#21301 opened Apr 4, 2026 by gautamvarmadatla
10 of 15 tasks
chore(deps): bump the uv group across 8 directories with 12 updates
dependencies
This PR changes a pyproject.toml or a poetry.lock file
python:uv
Pull requests that update python:uv code
size:XS
This PR changes 0-9 lines, ignoring generated files.
#21299 opened Apr 3, 2026 by dependabot (bot)
fix: correct typos, duplicate words and grammar across integrations
size:S
This PR changes 10-29 lines, ignoring generated files.
#21297 opened Apr 3, 2026 by Ricardo-M-L
1 task
fix: add missing f-string prefixes in moorcheh and dashscope integrations
size:S
#21296 opened Apr 3, 2026 by Ricardo-M-L
1 task
fix: correct docstring typos in integration packages
size:S
#21289 opened Apr 3, 2026 by Ricardo-M-L
1 task
feat: add HeaderAwareMarkdownSplitter node parser
size:XL
#21281 opened Apr 3, 2026 by shivam2407
2 tasks done
fix(s3-vector-store): warn when _node_content exceeds S3 filterable metadata limit
size:M
#21279 opened Apr 3, 2026 by octo-patch
fix: correct typo 'similarites' in knowledge graph retriever
size:XS
#21274 opened Apr 2, 2026 by Ricardo-M-L
fix: bump llama-index-llms-cerebras openai-like dependency to >=0.7.0
size:XS
#21272 opened Apr 2, 2026 by hashwnath
fix(mcp): handle valid MCP ContentBlock variants in get_prompt()
size:XL
#21271 opened Apr 2, 2026 by gautamvarmadatla
11 of 15 tasks
docs: add Hindsight memory integration
size:L
#21267 opened Apr 2, 2026 by DK09876
docs: cleanup docstrings and fix typos in base_agent.py
lgtm
This PR has been approved by a maintainer
size:XS
#21249 opened Apr 1, 2026 by Edge-Explorer
10 of 12 tasks
Fix synchronous streaming in SiliconFlow LLM integration
size:XS
#21248 opened Apr 1, 2026 by xueshanlinghu
feat(openai-like): add OpenAILikeResponses class for Responses API
size:L
#21246 opened Apr 1, 2026 by Ritikamuruganandam06
9 of 10 tasks
Add MLflow AI Gateway LLM integration example
size:L
fix: preserve thinking-only chunks in Ollama streaming
size:XS
#21234 opened Mar 31, 2026 by joaquinhuigomez
Add protected_params and DynamicValue to FunctionTool
size:L
#21228 opened Mar 31, 2026 by dgenio
10 tasks done
fix(callbacks): Ensure TokenCountingHandler tracks embedding tokens in ingestion pipelines
size:M
#21222 opened Mar 30, 2026 by manishrawal95
fix(llms-openai): fall back to reasoning field for vLLM compatibility
size:S
#21220 opened Mar 30, 2026 by IgnazioDS
3 tasks
fix: img_2_b64 returns bytes instead of str due to cast bypass
size:XS
#21209 opened Mar 29, 2026 by guoyangzhen
fix: add opt-in fallback_to_llm param for empty retrieval in CondensePlusContextChatEngine
size:L
#21206 opened Mar 28, 2026 by IgnazioDS
4 tasks