Amazon Bedrock

Amazon Bedrock is a fully managed service that offers a choice of high-performing foundation models (FMs) from leading AI companies like AI21 Labs, Anthropic, Cohere, Meta, Mistral AI, Stability AI, and Amazon through a single API, along with a broad set of capabilities you need to build generative AI applications with security, privacy, and responsible AI. Using Amazon Bedrock, you can easily experiment with and evaluate top FMs for your use case, privately customize them with your data using techniques such as fine-tuning and Retrieval Augmented Generation (RAG), and build agents that execute tasks using your enterprise systems and data sources. Since Amazon Bedrock is serverless, you don't have to manage any infrastructure, and you can securely integrate and deploy generative AI capabilities into your applications using the AWS services you are already familiar with.
The Sumo Logic Amazon Bedrock app dashboards offer insights into CloudTrail logs, CloudWatch Logs, and performance metrics for your Amazon Bedrock service. These preconfigured dashboards enable you to monitor the logs and runtime performance metrics of your Amazon Bedrock service.
Log and metric types
The Amazon Bedrock app uses the following logs and metrics:
- Monitor Amazon Bedrock API calls using CloudTrail.
- Monitor model invocation using CloudWatch Logs.
- Amazon Bedrock runtime metrics.
Sample CloudTrail log message
Sample CloudWatch logs
Sample queries
Location of successful Bedrock events (CloudTrail):

```
account=* region=us-east-1 namespace=aws/bedrock "\"eventSource\":\"bedrock.amazonaws.com\"" !errorCode
| json "eventSource", "eventName", "eventType", "sourceIPAddress", "errorCode", "errorMessage" nodrop
| json "userIdentity.type", "userIdentity.userName", "userIdentity.arn", "recipientAccountId", "awsRegion" as user_type, user_name, arn, accountid, region nodrop
| parse field=arn "arn:aws:sts::*:*/*" as f1, user_type, user_name nodrop
| json "requestParameters.modelId", "responseElements.modelId" as reqModelid, resmodelId nodrop
| if (!isBlank(reqModelid), reqModelid, resmodelId) as modelid
| where eventSource matches "bedrock.amazonaws.com"
| where modelid matches "ai21.j2-mid-v1" or isBlank(modelid)
| count as eventCount by sourceIPAddress
| lookup latitude, longitude from geo://location on ip=sourceIPAddress
```
Top error messages for failed Bedrock events (CloudTrail):

```
account=* region=us-east-1 namespace=aws/bedrock "\"eventSource\":\"bedrock.amazonaws.com\"" errorCode
| json "eventSource", "eventName", "eventType", "sourceIPAddress", "errorCode", "errorMessage" nodrop
| json "userIdentity.type", "userIdentity.userName", "userIdentity.arn", "recipientAccountId", "awsRegion" as user_type, user_name, arn, accountid, region nodrop
| parse field=arn "arn:aws:sts::*:*/*" as f1, user_type, user_name nodrop
| json "requestParameters.modelId", "responseElements.modelId" as reqModelid, resmodelId nodrop
| if (!isBlank(reqModelid), reqModelid, resmodelId) as modelid
| where eventSource matches "bedrock.amazonaws.com"
| where modelid matches "ai21.j2-mid-v1" or isBlank(modelid)
| count as eventCount by errorMessage
| sort by eventCount, errorMessage asc
```
Top 20 events, excluding read-only Get* and List* calls (CloudTrail):

```
account=* region=us-east-1 namespace=aws/bedrock "\"eventSource\":\"bedrock.amazonaws.com\""
| json "eventSource", "eventName", "eventType", "sourceIPAddress", "errorCode", "errorMessage" nodrop
| json "userIdentity.type", "userIdentity.userName", "userIdentity.arn", "recipientAccountId", "awsRegion" as user_type, user_name, arn, accountid, region nodrop
| parse field=arn "arn:aws:sts::*:*/*" as f1, user_type, user_name nodrop
| json "requestParameters.modelId", "responseElements.modelId" as reqModelid, resmodelId nodrop
| if (!isBlank(reqModelid), reqModelid, resmodelId) as modelid
| where eventSource matches "bedrock.amazonaws.com"
| where modelid matches "ai21.j2-mid-v1" or isBlank(modelid)
| where !(eventName matches "Get*") and !(eventName matches "List*")
| count as eventCount by eventName
| sort by eventCount, eventName asc
| limit 20
```
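The last query above filters out read-only calls by pattern-matching event names. As a rough illustration of that filter (the event names below are made up for the example, not taken from a real trail), the same logic in Python:

```python
from fnmatch import fnmatch

def is_read_only(event_name: str) -> bool:
    """Mirror the query's exclusion of Get*/List* CloudTrail event names."""
    return fnmatch(event_name, "Get*") or fnmatch(event_name, "List*")

# Illustrative CloudTrail event names (not from a real trail).
events = ["InvokeModel", "GetFoundationModel", "ListFoundationModels", "CreateModelCustomizationJob"]
non_read_only = [e for e in events if not is_read_only(e)]
print(non_read_only)  # ['InvokeModel', 'CreateModelCustomizationJob']
```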
Model usage details (CloudWatch model invocation logs):

```
account=* region=* namespace=aws/bedrock
| json "accountId", "region", "operation", "identity.arn", "modelId" as accountid, region, operation, arn, modelid nodrop
| parse field=arn "arn:aws:*::*:user/*" as user_type, f1, user_name nodrop
| parse field=arn "arn:aws:sts::*:*/*" as f1, user_type, user_name nodrop
| where accountid matches "*" and operation matches "*" and user_name matches "*" and modelid matches "*"
| count as events by accountid, region, operation, user_type, user_name, modelid
| sort by events, accountid asc, region asc, operation asc, user_type asc, user_name asc, modelid asc
```
Operations over time (CloudWatch model invocation logs):

```
account=* region=* namespace=aws/bedrock
| json "accountId", "region", "operation", "identity.arn", "modelId" as accountid, region, operation, arn, modelid nodrop
| parse field=arn "arn:aws:*::*:user/*" as user_type, f1, user_name nodrop
| parse field=arn "arn:aws:sts::*:*/*" as f1, user_type, user_name nodrop
| where accountid matches "*" and operation matches "*" and user_name matches "*" and modelid matches "*"
| timeslice 1h
| count by _timeslice, operation
| transpose row _timeslice column operation
```
Models invoked over time (CloudWatch model invocation logs):

```
account=* region=* namespace=aws/bedrock
| json "accountId", "region", "operation", "identity.arn", "modelId" as accountid, region, operation, arn, modelid nodrop
| parse field=arn "arn:aws:*::*:user/*" as user_type, f1, user_name nodrop
| parse field=arn "arn:aws:sts::*:*/*" as f1, user_type, user_name nodrop
| where accountid matches "*" and operation matches "*" and user_name matches "*" and modelid matches "*"
| timeslice 1h
| count by _timeslice, modelid
| transpose row _timeslice column modelid
```
Average invocation latency by model (metrics):

```
account=* region=* namespace=aws/bedrock modelid=* metric=InvocationLatency statistic=average | avg by modelid
```
Total invocations by model (metrics):

```
account=* region=* namespace=aws/bedrock modelid=* metric=Invocations statistic=sum | quantize using sum | sum by modelid
```
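The latency query above averages InvocationLatency per model ID. As a sketch of what that aggregation computes, using made-up datapoints:

```python
from collections import defaultdict

# Hypothetical (modelid, latency_ms) datapoints; the values are made up.
datapoints = [
    ("ai21.j2-mid-v1", 1200.0),
    ("ai21.j2-mid-v1", 800.0),
    ("anthropic.claude-v2", 1500.0),
]

# Accumulate a running [sum, count] per model, then divide.
totals = defaultdict(lambda: [0.0, 0])
for modelid, latency in datapoints:
    totals[modelid][0] += latency
    totals[modelid][1] += 1

avg_by_model = {m: total / n for m, (total, n) in totals.items()}
print(avg_by_model)  # {'ai21.j2-mid-v1': 1000.0, 'anthropic.claude-v2': 1500.0}
```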
Collecting logs and metrics for the Amazon Bedrock app
Collect CloudWatch Metrics
Sumo Logic supports collecting metrics using two source types:
- Configure an AWS Kinesis Firehose for Metrics Source (Recommended); or
- Configure an Amazon CloudWatch Source for Metrics.
- Namespace. The namespace for the Amazon Bedrock service is AWS/Bedrock.
- Metadata. Add an account field to the source and assign it a value that is a friendly name/alias for the AWS account from which you are collecting metrics. Metrics can be queried via the account field.
Collect Amazon Bedrock CloudTrail logs
- Add an AWS CloudTrail Source to your Hosted Collector.
- Name. Enter a name to display the new Source.
- Description. Enter an optional description.
- S3 Region. Select the Amazon Region for your Amazon Bedrock S3 bucket.
- Bucket Name. Enter the exact name of your Amazon Bedrock S3 bucket.
- Path Expression. Enter the string that matches the S3 objects you'd like to collect. You can use a wildcard (*) in this string. Do not use a leading forward slash (see Amazon Path Expressions), and do not include the S3 bucket name, which is not part of the path.
- Source Category. Enter aws/observability/cloudtrail/logs.
- Fields. Add an account field and assign it a value that is a friendly name/alias for the AWS account from which you are collecting logs. Logs can be queried via the account field.
- Access Key ID and Secret Access Key. Enter your Amazon Access Key ID and Secret Access Key. Learn how to use Role-based access to AWS here.
- Log File Discovery > Scan Interval. Use the default of 5 minutes, or enter the frequency at which Sumo Logic should scan your S3 bucket for new data. Learn how to configure Log File Discovery here.
- Enable Timestamp Parsing. Select the Extract timestamp information from log file entries check box.
- Time Zone. Select Ignore time zone from the log file and instead use, and select UTC from the dropdown.
- Timestamp Format. Select Automatically detect the format.
- Enable Multiline Processing. Select the Detect messages spanning multiple lines check box, and select Infer Boundaries.
- Click Save.
Collect Amazon Bedrock CloudWatch logs
To enable Amazon Bedrock CloudWatch logs, follow the steps mentioned in the AWS documentation. When configuring CloudWatch Logs, ensure that the log group name follows the pattern /aws/bedrock/*.
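Before wiring up collection, you can sanity-check a chosen log group name locally against the required pattern (the group names below are examples, not taken from your account):

```python
from fnmatch import fnmatch

REQUIRED_PATTERN = "/aws/bedrock/*"

def log_group_ok(name: str) -> bool:
    """Check that a CloudWatch log group name follows /aws/bedrock/*."""
    return fnmatch(name, REQUIRED_PATTERN)

print(log_group_ok("/aws/bedrock/modelinvocations"))  # True
print(log_group_ok("/aws/lambda/my-function"))        # False
```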

Sumo Logic supports several methods for collecting logs from Amazon CloudWatch. You can choose either of them to collect logs:
- AWS Kinesis Firehose for Logs. Configure an AWS Kinesis Firehose for Logs (Recommended); or
- Lambda Log Forwarder. Configure collection of Amazon CloudWatch Logs using our AWS Lambda function with a Sumo Logic-provided CloudFormation template, as described in Amazon CloudWatch Logs, or configure collection without CloudFormation; see Collect Amazon CloudWatch Logs using a Lambda Function.

While configuring the CloudWatch log source, the following fields can be added to the source:
- Add an account field and assign it a value that is a friendly name/alias for the AWS account from which you are collecting logs. Logs can be queried via the account field.
- Add a region field and assign it the value of the respective AWS region where Bedrock exists.
- Add an accountId field and assign it the value of the respective AWS account ID being used.
Field in Field Schema
- Classic UI. In the main Sumo Logic menu, select Manage Data > Logs > Fields. New UI. In the top menu select Configuration, and then under Logs select Fields. You can also click the Go To... menu at the top of the screen and select Fields.
- Search for the modelId field.
- If not present, create it. Learn how to create and manage fields here.
Field Extraction Rule(s)
Create a Field Extraction Rule for CloudTrail Logs. Learn how to create a Field Extraction Rule here.
Rule Name: AwsObservabilityBedrockCloudTrailLogsFER
Applied at: Ingest Time
Scope (Specific Data): account=* eventname eventsource "bedrock.amazonaws.com"
Parse Expression:

```
json "eventSource", "awsRegion", "recipientAccountId" as event_source, region, accountid nodrop
| where event_source matches "bedrock.amazonaws.com"
| "aws/bedrock" as namespace
| json "requestParameters.modelId", "responseElements.modelId" as reqModelid, resmodelId nodrop
| if (!isBlank(reqModelid), reqModelid, resmodelId) as modelId
| fields accountid, region, namespace, modelId
```
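This rule coalesces the model ID from the request parameters or, failing that, the response elements. A minimal Python sketch of the same logic, run against a synthetic CloudTrail event (the field values are illustrative):

```python
def extract_model_id(event: dict) -> str:
    """Mirror: if (!isBlank(reqModelid), reqModelid, resmodelId) as modelId."""
    req = event.get("requestParameters") or {}
    res = event.get("responseElements") or {}
    return req.get("modelId") or res.get("modelId") or ""

# Synthetic event: here the modelId appears only in responseElements.
event = {
    "eventSource": "bedrock.amazonaws.com",
    "requestParameters": {},
    "responseElements": {"modelId": "ai21.j2-mid-v1"},
}
print(extract_model_id(event))  # ai21.j2-mid-v1
```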
Create/Update Field Extraction Rule(s) for Bedrock CloudWatch logs
Rule Name: AwsObservabilityBedrockCloudWatchLogsFER
Applied at: Ingest Time
Scope (Specific Data):
account=* region=* _sourceHost=/aws/bedrock/*
Parse Expression:

```
if (isEmpty(namespace),"unknown",namespace) as namespace
| if (_sourceHost matches "/aws/bedrock/*", "aws/bedrock", namespace) as namespace
| json "modelId" as modelId nodrop
| tolowercase(modelId) as modelId
| fields namespace, modelId
```
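This rule derives the namespace from the log group name and lowercases the model ID. The same two steps, sketched in Python with an illustrative _sourceHost value:

```python
from fnmatch import fnmatch

def enrich(source_host: str, message: dict) -> dict:
    """Mirror the FER: derive namespace from _sourceHost, lowercase modelId."""
    namespace = "aws/bedrock" if fnmatch(source_host, "/aws/bedrock/*") else "unknown"
    model_id = str(message.get("modelId", "")).lower()
    return {"namespace": namespace, "modelId": model_id}

print(enrich("/aws/bedrock/modelinvocations", {"modelId": "AI21.J2-Mid-V1"}))
# {'namespace': 'aws/bedrock', 'modelId': 'ai21.j2-mid-v1'}
```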
Centralized AWS CloudTrail log collection
If you have a centralized collection of CloudTrail logs and are ingesting them from all accounts into a single Sumo Logic CloudTrail log source, create the following Field Extraction Rule to map each AWS account ID to a friendly name/alias. Create the rule if it is not already present, or update it as required.
Rule Name: AWS Accounts
Applied at: Ingest Time
Scope (Specific Data): _sourceCategory=aws/observability/cloudtrail/logs
Parse Expression:
Enter a parse expression to create an account field that maps to the alias you set for each sub-account. For example, if you used the "dev" alias for an AWS account with ID "956882123456" and the "prod" alias for an AWS account with ID "567680881046", your parse expression would look like:
```
| json "recipientAccountId"
// Manually map your aws account id with the AWS account alias you setup earlier for individual child account
| "" as account
| if (recipientAccountId = "956882123456", "dev", account) as account
| if (recipientAccountId = "567680881046", "prod", account) as account
| fields account
```
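The chain of if() statements is simply an account-ID-to-alias lookup. Sketched in Python with the same example IDs and aliases from above:

```python
# Example mapping from the parse expression above; replace with your own aliases.
ACCOUNT_ALIASES = {
    "956882123456": "dev",
    "567680881046": "prod",
}

def account_alias(recipient_account_id: str) -> str:
    """Map a recipientAccountId to its friendly alias, defaulting to ""."""
    return ACCOUNT_ALIASES.get(recipient_account_id, "")

print(account_alias("956882123456"))  # dev
print(account_alias("000000000000"))  # (empty string for unmapped accounts)
```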
Installing the Bedrock app
Now that you have set up a collection for Amazon Bedrock, install the Sumo Logic app to use the pre-configured dashboards that provide visibility into your environment for real-time analysis of overall usage.
To install the app, do the following:
- Select App Catalog.
- In the 🔎 Search Apps field, run a search for your desired app, then select it.
- Click Install App.
note
Sometimes this button says Add Integration.
- Click Next in the Setup Data section.
- In the Configure section of your respective app, complete the following fields.
- Key. Select either of these options for the data source.
- Choose Source Category and select a source category from the list for Default Value.
- Choose Custom, and enter a custom metadata field. Insert its value in Default Value.
- Click Next. You will be redirected to the Preview & Done section.
Post-installation
Once your app is installed, it will appear in your Installed Apps folder, and dashboard panels will start to fill automatically.
Each panel slowly fills with data matching the time range query and received since the panel was created. Results will not immediately be available, but will update with full graphs and charts over time.
Viewing the Bedrock dashboards
We highly recommend you view these dashboards in the AWS Observability view of the AWS Observability solution.
Overview
The Amazon Bedrock - Overview dashboard provides a view of the overall health of the Bedrock service, based on logs and metrics.
Use this dashboard to:
- Monitor locations of successful and failed Amazon Bedrock user activity events.
- Monitor all read-only and non-read-only events.
- Monitor the most active users working on Bedrock infrastructure and the various events invoked on the Bedrock service.

CloudTrail Audit Overview
The Amazon Bedrock - CloudTrail Audit Overview dashboard provides a record of actions taken by a user, role, or an AWS service in Amazon Bedrock. CloudTrail captures all API calls for Amazon Bedrock as events.
Use this dashboard to:
- Monitor Amazon Bedrock-related audit logs using CloudTrail Events.
- Monitor locations of successful and failed Amazon Bedrock user activity events.
- Monitor all read-only and non-read-only events.
- Monitor the most active users working on Bedrock infrastructure and the various events invoked on the Bedrock service.

Model Invocation Log Analysis
The Amazon Bedrock - Model Invocation Log Analysis dashboard provides insights into the audit events of your invocation logs, model input data, and model output data for all Amazon Bedrock invocations in your AWS account.
Use this dashboard to:
- Monitor Amazon Bedrock-related audit logs using CloudWatch Events.
- Monitor operational events and the models being utilized.
- Monitor the most active users working on the Bedrock service.

Runtime Performance Monitoring
The Amazon Bedrock - Runtime Performance Monitoring dashboard provides statistical insights into runtime model invocation metrics.
Use this dashboard to:
- Monitor all invocation-related metrics.
- Monitor and track input and output tokens.
- Monitor and track images in the output.
