As a software engineer working in healthcare, I've seen first-hand how patients want instant answers but compliance makes everything tricky. We can't just throw PHI (Protected Health Information) into ChatGPT and call it a day.
So I decided to build a HIPAA-friendly chatbot using AWS Lambda (serverless backend) and Amazon Bedrock (LLM service), with data masking to keep sensitive info safe.
This post walks you through:
- Why chatbots in healthcare are challenging
- How to design a compliant architecture
- Code snippets for AWS Lambda + Bedrock
- Tips for keeping PHI secure
The Challenge: AI + Healthcare = Risk
- Chatbots are great for FAQs, triage, and scheduling.
- But if you send raw PHI (like names, MRNs, diagnoses) to an LLM... that's a compliance nightmare.
- HIPAA requires minimum necessary access and strict controls.
Architecture at a Glance
Flow:
- Patient message → API Gateway
- Lambda pre-processor → scrubs PHI (names, DOB, SSNs)
- Bedrock LLM → processes the masked query
- Lambda post-processor → reinserts placeholders if needed
- Response → returned to patient securely

API Gateway → Lambda (mask PHI) → Bedrock → Lambda (restore placeholders) → Patient
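
To give a feel for the wiring, here's a minimal infrastructure sketch using AWS CDK in Python. The construct names, runtime version, and asset path are placeholders I've chosen for illustration, and a real HIPAA deployment would also need encryption, logging, and a BAA with AWS:

from aws_cdk import Stack, aws_apigateway as apigw, aws_iam as iam, aws_lambda as _lambda
from constructs import Construct

class ChatbotStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Lambda that masks PHI and calls Bedrock (handler code shown in Step 1).
        handler = _lambda.Function(
            self, "ChatHandler",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="app.lambda_handler",
            code=_lambda.Code.from_asset("lambda"),  # placeholder asset path
        )

        # Allow the function to invoke Bedrock models.
        handler.add_to_role_policy(iam.PolicyStatement(
            actions=["bedrock:InvokeModel"],
            resources=["*"],  # scope down to specific model ARNs in practice
        ))

        # Front the Lambda with API Gateway.
        apigw.LambdaRestApi(self, "ChatApi", handler=handler)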
Step 1: Setting up AWS Lambda
A basic Python Lambda handler:
import json
import re

import boto3

bedrock = boto3.client('bedrock-runtime')

def mask_phi(text):
    # Toy example: replace dates and a few hard-coded names.
    # Real PHI detection needs more than regexes (see Next Steps).
    text = re.sub(r'\d{2}/\d{2}/\d{4}', '[DATE]', text)
    text = re.sub(r'\b(Alice|Bob|John)\b', '[NAME]', text)
    return text

def lambda_handler(event, context):
    user_input = event['queryStringParameters']['q']
    masked_input = mask_phi(user_input)

    # Claude v2 on Bedrock expects a Human/Assistant prompt and a token limit;
    # json.dumps also handles quoting that a raw f-string would break on.
    body = json.dumps({
        "prompt": f"\n\nHuman: {masked_input}\n\nAssistant:",
        "max_tokens_to_sample": 300
    })

    response = bedrock.invoke_model(
        modelId="anthropic.claude-v2",
        contentType="application/json",
        accept="application/json",
        body=body
    )

    return {
        "statusCode": 200,
        "body": response['body'].read().decode('utf-8')
    }
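
To sanity-check the handler before deploying, you can call it with a hand-rolled event that mimics the API Gateway proxy shape (a hypothetical snippet; the Bedrock call still needs AWS credentials and model access in your account):

# Fake API Gateway event for a quick local test.
fake_event = {"queryStringParameters": {"q": "Bob had chest pain on 01/02/2024. What should he do?"}}

result = lambda_handler(fake_event, None)
print(result["statusCode"], result["body"])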
Step 2: Talking to Bedrock Safely
- Always send masked input only.
- Example:

Input:
John Smith has a fever since 09/21/2025. Should he see a doctor?

Masked:
[NAME] has a fever since [DATE]. Should they see a doctor?

Note that the toy regex above would only catch the first name; a production scrubber needs a real PHI detector (see Amazon Comprehend Medical under Next Steps).
Step 3: Post-Processing
If you need to restore placeholders (like Hello [NAME]), you can map them back safely from session state.
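
Here's one way that could look, as a rough sketch: the masking step records what it replaced in a per-session map (in practice keep it in encrypted storage such as DynamoDB, never in the prompt), and the post-processor swaps the placeholders back before the reply goes out. The function names here are my own, not part of the handler above:

import re

def mask_phi_reversible(text, session_store):
    # Replace PHI with numbered placeholders and remember the originals.
    def _mask(match, label):
        placeholder = f"[{label}_{len(session_store)}]"
        session_store[placeholder] = match.group(0)
        return placeholder

    text = re.sub(r'\d{2}/\d{2}/\d{4}', lambda m: _mask(m, "DATE"), text)
    text = re.sub(r'\b(Alice|Bob|John)\b', lambda m: _mask(m, "NAME"), text)
    return text

def restore_placeholders(text, session_store):
    # Swap placeholders back before returning the response to the patient.
    for placeholder, original in session_store.items():
        text = text.replace(placeholder, original)
    return text

session = {}  # per-session map; store encrypted, keyed by session ID
masked = mask_phi_reversible("John has a fever since 09/21/2025.", session)
# ... send `masked` to Bedrock, then restore placeholders in the reply:
reply = restore_placeholders("Hello [NAME_1], please monitor the fever.", session)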
Why This Matters
- Patients get instant responses
- Engineers stay HIPAA-compliant
- Serverless (Lambda) keeps costs low
- Bedrock provides enterprise-grade LLMs without exposing PHI
Next Steps
- Add DynamoDB to store chat history (encrypted)
- Plug in Cognito for authentication
- Expand PHI scrubbing with Amazon Comprehend Medical (see the sketch below)
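
As a rough sketch of that last step: Comprehend Medical's detect_phi call returns entities with character offsets and types, which lets you mask by category instead of relying on hand-written regexes (entity coverage and service limits are worth checking against the docs):

import boto3

comprehend_medical = boto3.client('comprehendmedical')

def mask_phi_with_comprehend(text):
    # Detect PHI entities (names, dates, IDs, ...) with their character offsets.
    entities = comprehend_medical.detect_phi(Text=text)['Entities']

    # Replace from the end of the string so earlier offsets stay valid.
    for entity in sorted(entities, key=lambda e: e['BeginOffset'], reverse=True):
        placeholder = f"[{entity['Type']}]"
        text = text[:entity['BeginOffset']] + placeholder + text[entity['EndOffset']:]
    return text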
Closing Thoughts
As engineers, we often think "just ship the feature," but in healthcare, privacy is the feature.
This project taught me that it's possible to marry AI innovation with compliance if we design carefully.
Repo link: GitHub - hipaa-chatbot-bedrock
