Build RAG-based generative AI applications on AWS using Amazon EKS and Amazon Bedrock

Generative artificial intelligence (AI) applications are commonly built using a technique called Retrieval Augmented Generation (RAG) that gives foundation models (FMs) access to additional data they didn't have during training. This data is used to enrich the generative AI prompt to deliver more context-specific and accurate responses without continuously retraining the FM, while also improving transparency and reducing hallucinations.
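The RAG idea described above can be sketched as a simple prompt-augmentation step. This is a minimal illustration; the function and variable names are ours, not part of any AWS SDK.

```python
# Minimal sketch of RAG prompt augmentation (illustrative names, not an AWS API).
def augment_prompt(user_query: str, retrieved_chunks: list) -> str:
    """Enrich the generative AI prompt with retrieved context so the FM
    can answer from company data it was never trained on."""
    context = "\n\n".join(retrieved_chunks)
    return (
        "Answer the question using only the context below.\n\n"
        f"Context:\n{context}\n\n"
        f"Question: {user_query}"
    )

prompt = augment_prompt(
    "What is Amazon Bedrock?",
    ["Amazon Bedrock is a fully managed service offering foundation models."],
)
```

Because the context travels inside the prompt, the FM can cite company data without being retrained on it.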
In this post, we demonstrate a solution that uses Amazon Elastic Kubernetes Service (Amazon EKS) for a RAG-based application with Amazon Bedrock.
Amazon EKS provides a scalable, secure, and cost-efficient environment for building RAG applications. It improves performance with well-tuned compute, auto-scales GPU workloads across Amazon EC2 instances and AWS Fargate, and provides enterprise-grade integration with AWS services such as AWS Identity and Access Management (IAM).
Our solution uses Amazon S3 as the data source and populates an Amazon OpenSearch Serverless vector database through Amazon Bedrock Knowledge Bases with the user's files and folders and their associated metadata. This enables a RAG scenario with Amazon Bedrock: the generative AI prompt is enriched through the Amazon Bedrock APIs with company-specific data retrieved from the OpenSearch Serverless vector database.
Solution overview
The solution provisions an Amazon EKS cluster whose worker nodes are Amazon Elastic Compute Cloud (Amazon EC2) instances. Each worker node in the cluster is provisioned as part of an Amazon EC2 Auto Scaling group managed by EKS.
The cluster runs pods across two Availability Zones; each pod hosts the Bedrock RAG container image stored in Amazon Elastic Container Registry (Amazon ECR). This setup ensures that resources are used efficiently, scaling up or down based on demand. A Horizontal Pod Autoscaler (HPA) is configured to automatically adjust the number of pods in our deployment based on their CPU utilization.
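The HPA's scaling decision can be sketched with its documented formula, desiredReplicas = ceil(currentReplicas × currentMetricValue / targetMetricValue):

```python
import math

def hpa_desired_replicas(current_replicas: int, current_cpu: float, target_cpu: float) -> int:
    """Core Horizontal Pod Autoscaler formula:
    desiredReplicas = ceil(currentReplicas * currentMetricValue / targetMetricValue)."""
    return math.ceil(current_replicas * current_cpu / target_cpu)

# 3 pods averaging 90% CPU against a 60% target scale out to 5 pods.
replicas = hpa_desired_replicas(3, 90.0, 60.0)
```

When average CPU drops below the target, the same formula scales the deployment back in, so the RAG service tracks demand in both directions.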
The RAG retrieval application container uses the Amazon Bedrock Knowledge Bases APIs. The solution exposes the RAG functionality to end users through a scalable endpoint: a Kubernetes service fronted by an AWS Application Load Balancer (ALB) provisioned through an EKS ingress.
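As a sketch of how the application might query the knowledge base, the `Retrieve` API of the `bedrock-agent-runtime` client returns vector search hits for a query. The knowledge base ID below is a placeholder, and the call itself requires AWS credentials, so it is wrapped in a function rather than executed.

```python
KB_ID = "KB-PLACEHOLDER-ID"  # placeholder: substitute your knowledge base ID

# Request shape for the Retrieve API (numberOfResults caps the vector hits).
retrieve_request = {
    "knowledgeBaseId": KB_ID,
    "retrievalQuery": {"text": "What is Amazon Bedrock?"},
    "retrievalConfiguration": {
        "vectorSearchConfiguration": {"numberOfResults": 3}
    },
}

def retrieve_chunks(request: dict) -> list:
    """Call the knowledge base and return the retrieved text chunks.
    Requires AWS credentials; not executed in this sketch."""
    import boto3
    client = boto3.client("bedrock-agent-runtime")
    response = client.retrieve(**request)
    return [r["content"]["text"] for r in response["retrievalResults"]]
```

The returned chunks are what the application folds into the prompt before inference.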
The RAG retrieval application orchestrates the RAG workflow with Amazon Bedrock: it accepts a user query at the exposed API endpoint, retrieves the relevant context from the vector index, and sends the enriched prompt to the FM offered by Amazon Bedrock for inference.
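This workflow can also be expressed with the `RetrieveAndGenerate` API, which performs retrieval and FM inference in one call. The knowledge base ID and model ARN below are placeholders, and the AWS call is left unexecuted.

```python
KB_ID = "KB-PLACEHOLDER-ID"  # placeholder: substitute your knowledge base ID
MODEL_ARN = (  # placeholder ARN for Anthropic's Claude 3.5 Sonnet
    "arn:aws:bedrock:us-east-1::foundation-model/"
    "anthropic.claude-3-5-sonnet-20240620-v1:0"
)

# Request shape for RetrieveAndGenerate: retrieval + generation in one call.
rag_request = {
    "input": {"text": "Summarize the Amazon Bedrock User Guide."},
    "retrieveAndGenerateConfiguration": {
        "type": "KNOWLEDGE_BASE",
        "knowledgeBaseConfiguration": {
            "knowledgeBaseId": KB_ID,
            "modelArn": MODEL_ARN,
        },
    },
}

def ask(request: dict) -> str:
    """Invoke RetrieveAndGenerate and return the generated answer.
    Requires AWS credentials; not executed in this sketch."""
    import boto3
    client = boto3.client("bedrock-agent-runtime")
    return client.retrieve_and_generate(**request)["output"]["text"]
```

Using the combined API keeps the container logic small: retrieval, prompt augmentation, and inference are delegated to Amazon Bedrock.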
The following architecture diagram shows the different components of our solution:
Prerequisites
Complete the following prerequisites:
- Confirm access to the model in Amazon Bedrock. In this solution, we use Anthropic's Claude 3.5 Sonnet on Amazon Bedrock.
- Install the AWS Command Line Interface (AWS CLI).
- Install Docker.
- Install kubectl.
- Install Terraform.
Deploy the solution
The solution is available for download in the GitHub repo. Cloning the repository and using the provided Terraform template provisions all the components with their required configurations:
- Clone the Git repository:
- From the terraform folder, deploy the solution using Terraform:
Configure EKS
- Create a secret for the ECR registry:
- Navigate to the kubernetes/ingress folder:
  - Make sure that the AWS_Region variable in the bedrockragconfigmap.yaml file points to your AWS Region.
  - Replace the image URI on line 20 of the bedrockragdeployment.yaml file with the URI of your bedrockrag image from your ECR repository.
- Provision the EKS deployment, service, and ingress:
Create a knowledge base and upload data
To create a knowledge base and upload your data, follow these steps:
- Create an S3 bucket and upload your data to the bucket. For this blog post, we uploaded two files, the Amazon Bedrock User Guide and the Amazon FSx for NetApp ONTAP User Guide, to our S3 bucket.
- Create an Amazon Bedrock knowledge base. Follow the steps here to create a knowledge base. Accept all the defaults, including using the Quick create a new vector store option in Step 7, which creates an Amazon OpenSearch Serverless vector search collection as your knowledge base.
- In Step 5c of the knowledge base creation steps, provide the S3 URI of the location that contains the files for the data source of the knowledge base.
- After the knowledge base is provisioned, obtain the knowledge base ID from the Amazon Bedrock Knowledge Bases console for your newly created knowledge base.
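Instead of copying the ID from the console, it can also be looked up by name with the `bedrock-agent` client's `ListKnowledgeBases` API. The knowledge base name and the sample response below are illustrative, and the AWS call is left unexecuted.

```python
def find_kb_id(summaries: list, name: str):
    """Pick the knowledge base ID matching a name from ListKnowledgeBases output."""
    for kb in summaries:
        if kb["name"] == name:
            return kb["knowledgeBaseId"]
    return None

def list_kb_summaries() -> list:
    """Fetch knowledge base summaries. Requires AWS credentials;
    not executed in this sketch."""
    import boto3
    client = boto3.client("bedrock-agent")
    return client.list_knowledge_bases()["knowledgeBaseSummaries"]

# Illustrative sample of the knowledgeBaseSummaries shape:
sample = [{"knowledgeBaseId": "ABCD1234", "name": "bedrock-rag-kb"}]
kb_id = find_kb_id(sample, "bedrock-rag-kb")
```

The resulting ID is the value the application passes to the retrieval APIs.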
Query using the Application Load Balancer
You can query the model directly using the API front end provided by the AWS ALB provisioned by the Kubernetes (EKS) ingress controller. Navigate to the AWS ALB console and copy the DNS name of your ALB to use as your API:
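A query against the ALB endpoint might look like the following. The DNS name, the `/query` path, and the JSON field names are assumptions about the sample application, not documented values, and the live request is left unexecuted.

```python
import json
import urllib.request

# Placeholder: substitute the DNS name of your ALB.
ALB_DNS = "k8s-default-bedrockr-0123456789.us-east-1.elb.amazonaws.com"

def build_request(question: str) -> urllib.request.Request:
    """Build a POST to the ALB-fronted service (path and fields are assumptions)."""
    body = json.dumps({"question": question}).encode("utf-8")
    return urllib.request.Request(
        f"http://{ALB_DNS}/query",
        data=body,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def query(question: str) -> str:
    """Send the request and return the response body. Needs a live endpoint;
    not executed in this sketch."""
    with urllib.request.urlopen(build_request(question)) as resp:
        return resp.read().decode("utf-8")

req = build_request("What is Amazon Bedrock?")
```

Check your application's handler for the actual route and payload schema before using this.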
Clean up
To avoid recurring charges, clean up your account after trying the solution:
- From the terraform folder, delete the Terraform template for the solution:
terraform apply --destroy
- Delete the Amazon Bedrock knowledge base. From the Amazon Bedrock console, select the knowledge base you created in this solution, select Delete, then follow the steps to delete the knowledge base.
Conclusion
In this post, we demonstrated a solution that uses Amazon EKS with Amazon Bedrock to provide you with a framework for building your own containerized, automated, and scalable generative AI applications. Using Amazon S3 and Amazon Bedrock Knowledge Bases, our solution automates bringing your enterprise user data to Amazon Bedrock within the context of a prompt. You can use the approach demonstrated in this solution to automate and containerize your AI-based workloads while using Amazon Bedrock FMs for inference, with built-in, efficient horizontal scaling on Kubernetes-based EKS.
For more information about how to get started building with Amazon Bedrock and EKS for RAG scenarios, refer to the following resources:
About the authors
Kanishk Mahajan is Principal, Solutions Architecture at AWS. He leads cloud transformation and solution architecture for AWS customers and partners. Kanishk focuses on containers, cloud operations, migrations and modernizations, AI/ML, resilience, and security and compliance. He is a member of the Technical Field Community (TFC) in each of those domains at AWS.
Sandeep Batchu is a Senior Security Architect at Amazon Web Services, with extensive experience in software engineering, solutions architecture, and cybersecurity. Passionate about the impact of technology, Sandeep guides customers through their cloud journey, helping them design and implement secure, scalable, flexible, and resilient cloud architectures.



