AWS Bedrock

Setting up a Bedrock connection

AWS Bedrock provides a variety of foundation models on demand. You can connect your AWS Bedrock account to OmniAI to run models within your AWS ecosystem.

Prerequisites

Regions

The latest region support is listed in the official Bedrock Documentation. Supported regions include:

  • us-east-1

  • us-west-2

  • eu-west-3

  • ap-southeast-2

Connecting OmniAI to Bedrock

From the OmniAI settings page, you need to authenticate the connection. This can be done either by using an IAM User (with AWS Access Key ID and Secret Access Key) or an IAM Role (with Role ARN).

Using an IAM User

To provision AWS access keys, follow the official IAM Documentation. You will need the following values:

  • Access Key ID

  • Secret Access Key

  • Region
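
If you prefer the AWS CLI to the console, a minimal sketch for provisioning the keys is shown below. The user name omniai-bedrock is illustrative, and the CLI must already be configured with credentials that are allowed to manage IAM.

# Create a dedicated IAM user and generate an access key pair for it.
# The create-access-key output contains the AccessKeyId and SecretAccessKey to enter in OmniAI.
aws iam create-user --user-name omniai-bedrock
aws iam create-access-key --user-name omniai-bedrock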

At a minimum, the IAM user must have the following permissions:

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Action": [
                "bedrock:InvokeModel"
            ],
            "Resource": "arn:aws:bedrock:*::foundation-model/*"
        }
    ]
}
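
If you want to sanity-check the credentials and policy before connecting, a minimal AWS CLI sketch is shown below. The region, prompt, and output file are illustrative, the model ID is the Claude 3 Haiku model referenced later in this guide, and the call only succeeds once model access has been granted (see the model access steps below).

# Assumes the access keys above are configured, e.g. via AWS_ACCESS_KEY_ID / AWS_SECRET_ACCESS_KEY.
aws bedrock-runtime invoke-model \
    --region us-east-1 \
    --model-id anthropic.claude-3-haiku-20240307-v1:0 \
    --cli-binary-format raw-in-base64-out \
    --body '{"anthropic_version": "bedrock-2023-05-31", "max_tokens": 50, "messages": [{"role": "user", "content": "Hello"}]}' \
    response.json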

Using an IAM Role

Open your AWS Console and browse to the IAM service. Click Roles and Create role.

  • When creating the role, choose AWS Account for Trusted Entity Type.

  • If you are running Omni in your own VPC, leave This account selected. If you are using the Omni Cloud product, select Another AWS account and provide Omni's AWS Account ID: 851725384009.

  • Check the Require external ID checkbox and enter your External ID. You can find your External ID in the OmniAI settings page (a sketch of the resulting trust policy is shown after this list).

  • Attach the AmazonBedrockFullAccess policy to the role.

  • When done, click on your role and copy its ARN. Go back to OmniAI and enter the role ARN.

  • Click Connect.
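
For reference, when connecting from Omni Cloud with an external ID, the role's trust policy should look roughly like the sketch below. The external ID placeholder is the value from your OmniAI settings page.

{
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {
                "AWS": "arn:aws:iam::851725384009:root"
            },
            "Action": "sts:AssumeRole",
            "Condition": {
                "StringEquals": {
                    "sts:ExternalId": "<your-external-id>"
                }
            }
        }
    ]
}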

Creating a Bedrock User

You can view the full AWS Bedrock documentation at aws.amazon.com/bedrock. The following steps cover enabling Anthropic Claude model access for the account and credentials you configured above.

1. Visit the AWS Bedrock Console

Navigate to console.aws.amazon.com/bedrock. From there, go to Model Access.

2. Configure Model Access

From this view, select Manage Model Access in the top right, then select the Anthropic family of models to request access. Depending on your account status, you may be asked to complete use-case details before requesting access.

Remember to click Save Changes after making your selection.

3. Verifying Access

You should see the model status as Access Granted once the request is approved. This typically takes no more than 1 hour to process. You should also receive an email notification on approval.
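
You can also explore what is offered from the AWS CLI. The sketch below lists the Anthropic model IDs available in a region (the region shown is illustrative); note that this lists what Bedrock offers, so to confirm access has actually been granted, try an invocation as in the earlier CLI sketch.

# List the Anthropic foundation model IDs offered in this region.
aws bedrock list-foundation-models \
    --region us-east-1 \
    --by-provider anthropic \
    --query 'modelSummaries[].modelId'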

Connecting via the API

You can create and test your AWS Bedrock connection with the save-model-provider endpoint. This establishes the connection and verifies which models you can access.

POST /save-model-provider

Headers

An API Key is required to access this endpoint.

Name         Value
x-api-key    your_api_key

Body

Name      Type      Description
type      string    Model provider ("BEDROCK")
config    object    Config object (see below)

Config

Name                 Type      Description
awsAccessKeyId       string    AWS Access Key ID
awsRegion            string    AWS Region (e.g. us-east-1)
awsSecretAccessKey   string    AWS Secret Access Key

Example Request

curl --location 'https://api.getomni.ai/save-model-provider' \
--header 'x-api-key: a0000-b0000-c000-d000-e0000000000' \
--header 'Content-Type: application/json' \
--data '{
    "config": {
        "awsAccessKeyId": "accessKey",
        "awsRegion": "us-east-1",
        "awsSecretAccessKey": "secretKey"
    },
    "type": "BEDROCK"
}'

Response

The API will return a success boolean, as well as an array of supported models. The models array will only show models that are supported on OmniAI, and may not include all models available via AWS Bedrock.

{
    "success": true,
    "models": [
        "anthropic.claude-3-haiku-20240307-v1:0"
    ]
}
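
As a quick check from the command line, you can pipe the response through jq to list the detected models. This assumes jq is installed and reuses the request body from the example above.

curl --location 'https://api.getomni.ai/save-model-provider' \
    --header 'x-api-key: your_api_key' \
    --header 'Content-Type: application/json' \
    --data '{
        "config": {
            "awsAccessKeyId": "accessKey",
            "awsRegion": "us-east-1",
            "awsSecretAccessKey": "secretKey"
        },
        "type": "BEDROCK"
    }' | jq '.models'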
