This article demonstrates how to build a Federal Risk and Authorization Management Program (FedRAMP) compliant generative artificial intelligence (AI) application using Amazon Aurora PostgreSQL-Compatible Edition and Amazon Bedrock. The solution uses Retrieval Augmented Generation (RAG) to improve responses by supplementing the prompt sent to a large language model (LLM) with relevant data retrieved from a company's internal business systems, rather than by retraining the model. The application uses Aurora machine learning to invoke machine learning (ML) and generative AI models through familiar SQL functions, and Amazon Bedrock to provide a choice of high-performing foundation models (FMs) from leading AI companies.
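As an illustrative sketch of the SQL-based approach, the following shows a Bedrock model being invoked directly from the database. It assumes the aws_ml extension (version 2) is installed, the cluster's IAM role permits bedrock:InvokeModel, and an Amazon Titan text model is available in the Region; the function signature and model ID should be verified against the current Aurora and Bedrock documentation.

```sql
-- Sketch: call a Bedrock foundation model from SQL via Aurora machine learning.
-- Assumes the aws_ml extension and the amazon.titan-text-express-v1 model;
-- verify both against the current AWS documentation before use.
CREATE EXTENSION IF NOT EXISTS aws_ml CASCADE;

SELECT aws_bedrock.invoke_model(
    model_id     := 'amazon.titan-text-express-v1',
    content_type := 'application/json',
    accept_type  := 'application/json',
    model_input  := '{"inputText": "Summarize FedRAMP in one sentence."}'
);
```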
The solution architecture involves ingesting documents from Amazon S3 into an Aurora PostgreSQL table, generating vector embeddings for them with SQL functions, and running a chatbot query that retrieves the most relevant passages to ground the generated response. The application front end is built with Streamlit, and the solution can be deployed in three main steps. The article provides a step-by-step guide to setting up an Aurora PostgreSQL cluster, ingesting documents, creating embeddings, and running a chatbot query.
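The ingest-embed-retrieve flow described above can be sketched as follows. The table name, column names, embedding model, and vector dimension here are illustrative assumptions, not taken from the article; the sketch assumes the pgvector extension, the aws_ml extension, and the Amazon Titan embeddings model (1536-dimensional output).

```sql
-- Hypothetical schema for ingested document chunks (names are illustrative).
CREATE EXTENSION IF NOT EXISTS vector;

CREATE TABLE IF NOT EXISTS documents (
    id        bigserial PRIMARY KEY,
    content   text NOT NULL,
    embedding vector(1536)
);

-- Embed each chunk by calling Bedrock through a SQL function; the returned
-- float array is cast to pgvector's vector type for similarity search.
UPDATE documents
SET embedding = aws_bedrock.invoke_model_get_embeddings(
        model_id     := 'amazon.titan-embed-text-v1',
        content_type := 'application/json',
        json_key     := 'embedding',
        model_input  := json_build_object('inputText', content)::text
    )::vector
WHERE embedding IS NULL;

-- Chatbot query step: fetch the passages nearest to the (pre-embedded)
-- user question; these are passed to the LLM as context for the answer.
SELECT content
FROM documents
ORDER BY embedding <=> :question_embedding
LIMIT 3;
```

In practice the question embedding is produced with the same embeddings model as the documents, so the cosine-distance ordering (`<=>`) compares like with like.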