
Commit 96e110e

Update to latest models
1 parent 6c89630 commit 96e110e

File tree

20 files changed: +1092 −93 lines changed
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "type": "api-change",
+  "category": "``cleanrooms``",
+  "description": "This release adds support for configurable spark properties for Cleanrooms PySpark workloads."
+}
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "type": "api-change",
+  "category": "``connect``",
+  "description": "Fixes in SDK for customers using TestCase APIs"
+}
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "type": "api-change",
+  "category": "``connectcampaignsv2``",
+  "description": "This release adds support for campaign entry limits configuration and hourly refresh frequency in Amazon Connect Outbound Campaigns."
+}
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "type": "api-change",
+  "category": "``groundstation``",
+  "description": "Adds support for updating contacts, listing antennas, and listing ground station reservations. New API operations - UpdateContact, ListContactVersions, DescribeContactVersion, ListAntennas, and ListGroundStationReservations."
+}
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "type": "api-change",
+  "category": "``imagebuilder``",
+  "description": "ImportDiskImage API adds registerImageOptions for Secure Boot control and custom UEFI data. It adds windowsConfiguration for selecting a specific edition from multi-image .wim files during ISO import."
+}
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "type": "api-change",
+  "category": "``neptune``",
+  "description": "Improving Documentation for Neptune"
+}
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "type": "api-change",
+  "category": "``quicksight``",
+  "description": "Public release of dashboard customization summary, S3 Tables data source type, Athena cross-account connector, custom sorting for controls, and AI-powered analysis generation."
+}
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "type": "api-change",
+  "category": "``sagemaker``",
+  "description": "Adds support for providing NetworkInterface for efa enabled instances and Simplified cluster creation for Slurm-orchestrated clusters with optional Lifecycle Script (LCS) configuration."
+}
Lines changed: 5 additions & 0 deletions
@@ -0,0 +1,5 @@
+{
+  "type": "api-change",
+  "category": "``sts``",
+  "description": "The STS client now supports configuring SigV4a through the auth scheme preference setting. SigV4a uses asymmetric cryptography, enabling customers using long-term IAM credentials to continue making STS API calls even when a region is isolated from the partition leader."
+}
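The STS entry above says SigV4a is chosen through the SDK-wide auth scheme preference setting, which takes an ordered, comma-separated list of scheme names. A minimal sketch of how such a preference list is expressed and parsed; the `AWS_AUTH_SCHEME_PREFERENCE` environment-variable name and the `parse_auth_scheme_preference` helper are assumptions for illustration, not confirmed by this commit:

```python
import os

# Assumed env-var form of the auth scheme preference setting: an ordered,
# comma-separated priority list, here preferring SigV4a over SigV4.
os.environ["AWS_AUTH_SCHEME_PREFERENCE"] = "sigv4a,sigv4"

def parse_auth_scheme_preference(raw: str) -> list[str]:
    """Split a comma-separated preference string into an ordered list,
    ignoring surrounding whitespace and empty entries."""
    return [scheme.strip() for scheme in raw.split(",") if scheme.strip()]

preference = parse_auth_scheme_preference(os.environ["AWS_AUTH_SCHEME_PREFERENCE"])
print(preference)  # ['sigv4a', 'sigv4']
```

With this preference in place, an STS client that supports both schemes would attempt SigV4a first, which is what allows calls to succeed when a region is isolated from the partition leader.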

awscli/botocore/data/cleanrooms/2022-02-17/service-2.json

Lines changed: 7 additions & 3 deletions
@@ -8175,7 +8175,7 @@
         },
         "memberAbilities":{
           "shape":"MemberAbilities",
-          "documentation":"<p>The abilities granted to the collaboration member. These determine what actions the member can perform within the collaboration.</p> <note> <p>The following values are currently not supported: <code>CAN_QUERY</code>, <code>CAN_RECEIVE_RESULTS,</code> and <code>CAN_RUN_JOB</code>. </p> <p>Set the value of <code>memberAbilities</code> to <code>[]</code> to allow a member to contribute data.</p> </note>"
+          "documentation":"<p>The abilities granted to the collaboration member. These determine what actions the member can perform within the collaboration.</p> <note> <p>The following values are currently not supported: <code>CAN_QUERY</code> and <code>CAN_RUN_JOB</code>. </p> <p>Set the value of <code>memberAbilities</code> to <code>[]</code> to allow a member to contribute data.</p> <p>Set the value of <code>memberAbilities</code> to <code>[CAN_RECEIVE_RESULTS]</code> to allow a member to contribute data and receive results.</p> </note>"
         },
         "displayName":{
           "shape":"DisplayName",
@@ -9493,14 +9493,18 @@
         "number":{
           "shape":"ProtectedJobWorkerComputeConfigurationNumberInteger",
           "documentation":"<p>The number of workers for a PySpark job.</p>"
+        },
+        "properties":{
+          "shape":"WorkerComputeConfigurationProperties",
+          "documentation":"<p>The configuration properties for the worker compute environment. These properties allow you to customize the compute settings for your Clean Rooms workloads.</p>"
         }
       },
       "documentation":"<p>The configuration of the compute resources for a PySpark job.</p>"
     },
     "ProtectedJobWorkerComputeConfigurationNumberInteger":{
       "type":"integer",
       "box":true,
-      "max":128,
+      "max":1024,
       "min":4
     },
     "ProtectedJobWorkerComputeType":{
@@ -11433,7 +11437,7 @@
       "members":{
         "spark":{
           "shape":"SparkProperties",
-          "documentation":"<p>The Spark configuration properties for SQL workloads. This map contains key-value pairs that configure Apache Spark settings to optimize performance for your data processing jobs. You can specify up to 50 Spark properties, with each key being 1-200 characters and each value being 0-500 characters. These properties allow you to adjust compute capacity for large datasets and complex workloads.</p>"
+          "documentation":"<p>The Spark configuration properties for SQL and PySpark workloads. This map contains key-value pairs that configure Apache Spark settings to optimize performance for your data processing jobs. You can specify up to 50 Spark properties, with each key being 1-200 characters and each value being 0-500 characters. These properties allow you to adjust compute capacity for large datasets and complex workloads.</p>"
         }
       },
       "documentation":"<p>The configuration properties that define the compute environment settings for workers in Clean Rooms. These properties enable customization of the underlying compute environment to optimize performance for your specific workloads.</p>",
