
fix(iam): add bedrock application-inference-profile to lambda execution roles #236

Merged
rstrahan merged 3 commits into aws-solutions-library-samples:develop from sebastianpuertosanc:patch-1
Mar 11, 2026
Conversation

@sebastianpuertosanc
Contributor

fix(iam): add bedrock application-inference-profile to lambda execution roles

**What changed:**
Added the `application-inference-profile/*` ARN pattern to the `bedrock:InvokeModel` and `bedrock:InvokeModelWithResponseStream` IAM policy statements across the document processing Lambda functions (Classification, Extraction, Assessment, and Summarization).

**Why it matters:**
Previously, the IAM policies in the SAM templates only permitted invocation of standard Foundation Models (`foundation-model/*`) and cross-region inference profiles (`inference-profile/*`). This strict string matching blocked the use of custom Application Inference Profiles (`application-inference-profile/*`).

By adding this ARN pattern, users can now successfully map custom inference profiles (such as Nova 2 Lite) to the IDP pipeline. This unlocks the ability to:

* Tag Bedrock invocations for granular cost allocation.
* Track custom model throughput and performance metrics.
* Avoid `AccessDeniedException` errors when substituting default models with application-specific profiles.
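The change described above can be sketched as the following SAM policy statement. This is a minimal illustration, not the actual diff: the `Sid`, the region wildcards, and the use of `!Sub` with pseudo parameters are assumptions; only the three ARN resource-type prefixes (`foundation-model/*`, `inference-profile/*`, `application-inference-profile/*`) come from the PR description.

```yaml
# Hypothetical sketch of the amended Lambda execution-role policy statement.
# Note: foundation-model ARNs are service-owned and have an empty account field,
# while inference-profile ARNs are account-scoped.
Policies:
  - Statement:
      - Sid: BedrockInvokeAccess  # assumed Sid, not from the PR diff
        Effect: Allow
        Action:
          - bedrock:InvokeModel
          - bedrock:InvokeModelWithResponseStream
        Resource:
          - !Sub arn:${AWS::Partition}:bedrock:*::foundation-model/*
          - !Sub arn:${AWS::Partition}:bedrock:*:${AWS::AccountId}:inference-profile/*
          # New pattern added by this PR: custom Application Inference Profiles
          - !Sub arn:${AWS::Partition}:bedrock:*:${AWS::AccountId}:application-inference-profile/*
```

Without the last entry, a call such as `bedrock-runtime invoke-model --model-id <application-inference-profile ARN>` from one of these Lambda functions would fail with `AccessDeniedException`, since the profile ARN matches neither of the two original patterns.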

**Related Issue(s):**
#235

By submitting this pull request, I confirm that you can use, modify, copy, and redistribute this contribution, under the terms of your choice.

rstrahan and others added 3 commits March 6, 2026 18:32
@rstrahan rstrahan changed the base branch from main to develop March 11, 2026 13:31
@rstrahan rstrahan merged commit ccd214a into aws-solutions-library-samples:develop Mar 11, 2026
1 check failed
@rstrahan
Contributor

Thanks so much @sebastianpuertosanc - will be in v0.5.2

