AWS Bedrock with aws-lite

[This will likely be outdated as AWS releases updates to Bedrock, its SDK, and the subsequent work on an aws-lite plugin for Bedrock.]

AWS Bedrock is AWS's managed AI service that gives users access to common models like Amazon Titan, as well as models from other orgs like Meta's Llama and Stability AI's Stable Diffusion.

aws-lite is a minimal AWS API client. It's particularly lightweight and fast compared to the official SDKs. (I contribute to aws-lite as part of my open source work on Architect, which is part of my job at Begin.)

Typically, a developer would use aws-lite in combination with at least one of the many available plugins; DynamoDB, S3, Lambda, etc. are all covered. However, there isn't (yet) a plugin for Bedrock. In the meantime, we can use the bare @aws-lite/client to make authenticated requests to any service API at AWS.

npm i @aws-lite/client

Requests will automatically use my AWS credentials found in ~/.aws/credentials.
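
If your credentials live under a named profile instead, you can tell the client which one to use when you construct it. A minimal sketch, assuming the standard @aws-lite/client config options (the profile name is illustrative):

import awsLite from '@aws-lite/client'

const aws = await awsLite({
  region: 'us-east-1',
  profile: 'bedrock-testing', // any named profile from ~/.aws/credentials
})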

Note: you will need to have access granted to specific Bedrock models on your own AWS account. This is fairly quick, but does require some AWS console spelunking.

Let's start with the basics by setting up aws-lite and getting a list of Meta models from Bedrock:

import AwsLite from '@aws-lite/client'
const PROVIDER = 'meta'

const aws = await AwsLite({
  region: 'us-east-1',
  autoloadPlugins: false,
  verifyService: false,
  // debug: true,
})

// https://docs.aws.amazon.com/bedrock/latest/APIReference/API_ListFoundationModels.html
const ListFoundationModels = await aws({
  service: 'bedrock',
  path: '/foundation-models',
  query: { byProvider: PROVIDER },
})
const { modelSummaries: models } = ListFoundationModels.payload

console.log(models.length, `"${PROVIDER}" models found on Bedrock`)

As of today, this prints

8 "meta" models found on Bedrock

Next, we can invoke one of these models by its known ID, using a hardcoded question:

const LLAMA_ID = 'meta.llama2-13b-chat-v1'
const myLlama = models.find(({modelId}) => modelId === LLAMA_ID)

console.log(`Using "${myLlama.modelName}"`) // Using "Llama 2 Chat 13B"

const PROMPT = 'What is an axolotl?'
console.log(`Question: "${PROMPT}"`) // Question: "What is an axolotl?"

// https://docs.aws.amazon.com/bedrock/latest/APIReference/API_runtime_InvokeModel.html
const InvokeModel = await aws({
  service: 'bedrock',
  host: 'bedrock-runtime.us-east-1.amazonaws.com', 
  // Note: the host differs from the service name, so we provide the full value
  path: `/model/${myLlama.modelId}/invoke`,
  // https://docs.aws.amazon.com/bedrock/latest/userguide/model-parameters-meta.html
  payload: {
    prompt: PROMPT,
    temperature: 0.2,
    top_p: 0.9,
    max_gen_len: 512,
  },
})
const { generation } = InvokeModel.payload

console.log(generation.trim()) // the answer describing axolotls!
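
The response payload also reports token usage, which is where the token line in the output below comes from. Going by the documented Meta response fields (treat the exact names as an assumption on my part):

// Tally prompt + generation tokens from the response body
const { prompt_token_count, generation_token_count } = InvokeModel.payload
console.log(`Spent ${prompt_token_count + generation_token_count} (prompt: ${prompt_token_count} + generation: ${generation_token_count}) tokens`)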

That's it!

> node index.js
8 "meta" models found on Bedrock
Using "Llama 2 Chat 13B"...

Question: "What is an axolotl?"
An axolotl (Ambystoma mexicanum) is a type of salamander that
never undergoes metamorphosis. Instead, it remains in its 
larval stage throughout its life, which can range from 10 to 
15 years in captivity. Axolotls are native to Mexico and are 
found in freshwater lakes, canals, and drainage ditches. They 
are known for their unique ability to regrow their limbs, 
eyes, and parts of their brains, making them a popular subject 
for scientific research.

[... lengthy answer redacted]

Spent 414 (prompt: 9 + generation: 405) tokens

I think next I will set this up as a small CLI to allow me to choose other models and enter a prompt from the command line.
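
A rough sketch of what that could look like with Node's built-in readline/promises, reusing the models list and InvokeModel request from above (purely illustrative):

import readline from 'node:readline/promises'
import { stdin as input, stdout as output } from 'node:process'

const rl = readline.createInterface({ input, output })

// Let the user pick a model by number, then enter a prompt
models.forEach((model, i) => console.log(`${i}: ${model.modelName} (${model.modelId})`))
const choice = Number(await rl.question('Model number: '))
const prompt = await rl.question('Prompt: ')
rl.close()

// ...then pass models[choice].modelId and prompt into the InvokeModel request above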