BA Genie App - Developer Wiki

Deployment

DEV Deployment

Trigger: Push to main branch

Workflow: .github/workflows/deploy-dev.yml

Process:

  • Triggered automatically on push to main
  • Runs SST deploy with stage dev
  • After a successful deployment, runs the Prisma database migration

Stage: dev
Environment Name: dev
URL: https://dev.{ROOT_DOMAIN}

Production Deployment

Trigger: Push a tag matching pattern prod-*

Workflow: .github/workflows/deploy-prod.yml

Process:

  1. Database Migration - Runs first (Prisma migrations)
  2. SST Deploy - Main application deployment
  3. Verification - Waits 30 seconds, then verifies the deployment is responding

Tag Pattern: prod-YY.MM.DD-XX

  • YY - Year (2 digits)
  • MM - Month (2 digits)
  • DD - Day (2 digits)
  • XX - Release number for that day (starting from 01)

Examples:

  • prod-25.10.20-01 - First release on October 20, 2025
  • prod-25.10.20-02 - Second release on October 20, 2025
  • prod-25.12.05-01 - First release on December 5, 2025
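The tag convention above can be checked locally before pushing. The following TypeScript sketch is a hypothetical helper, not part of the repo: the workflow itself only matches the loose glob prod-*, so this stricter validation is an assumption layered on top of the documented naming convention.

```typescript
// Validates a tag against the prod-YY.MM.DD-XX convention described above.
// Hypothetical local helper; the deploy workflow only matches the glob `prod-*`.
const PROD_TAG = /^prod-(\d{2})\.(\d{2})\.(\d{2})-(\d{2})$/;

function isValidProdTag(tag: string): boolean {
  const m = PROD_TAG.exec(tag);
  if (!m) return false;
  const month = Number(m[2]);
  const day = Number(m[3]);
  const release = Number(m[4]);
  // Basic range checks: real months/days, releases start at 01.
  return month >= 1 && month <= 12 && day >= 1 && day <= 31 && release >= 1;
}
```

For example, isValidProdTag("prod-25.10.20-01") returns true, while a typo like "prod-25.13.20-01" (month 13) is rejected.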

How to deploy to production:

  Option 1 - Create a release via GitHub UI (recommended):

  1. Go to Repository → Releases → Draft a new release
  2. Click Choose a tag → Type new tag (e.g., prod-25.12.05-01) → Create new tag
  3. Set the release title (e.g., Production Release 25.12.05-01)
  4. Add release notes describing the changes
  5. Click Publish release

  Option 2 - Via command line:

    # Create and push the tag
    git tag prod-25.12.05-01
    git push origin prod-25.12.05-01
    
    # Optionally create a GitHub release afterwards via CLI
    gh release create prod-25.12.05-01 --title "Production Release 25.12.05-01" --notes "Release notes here"
    

Stage: production
Environment Name: prod
URL: https://production.{ROOT_DOMAIN}

Preview/Branch Deployment

Trigger: Pull Request (opened, synchronize, reopened)

Workflow: .github/workflows/deploy-preview.yml

Process:

  • Generates a URL-safe slug from the branch name
  • Deploys to a unique stage: review-{branch-slug}

Stage: review-{slug}
Environment Name: review/{slug}
URL: https://review-{slug}.{ROOT_DOMAIN}

Note: Preview environments are automatically created for each PR and use the same secrets as DEV.
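The branch-to-slug conversion mentioned above can be sketched as follows. This is a hypothetical reconstruction; the actual logic lives in .github/workflows/deploy-preview.yml and may differ (the 40-character cap is an assumption to keep stage names within AWS resource-name limits).

```typescript
// Hypothetical sketch of the URL-safe slug generation described above.
// The real implementation in deploy-preview.yml may use different rules.
function branchToSlug(branch: string): string {
  return branch
    .toLowerCase()
    .replace(/[^a-z0-9]+/g, "-") // collapse runs of non-alphanumerics to one hyphen
    .replace(/^-+|-+$/g, "")     // trim leading/trailing hyphens
    .slice(0, 40);               // assumed cap to keep stage names short
}
```

For example, a branch named feature/My_New-Feature would deploy to the stage review-feature-my-new-feature.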


Environment Variables

How Environment Variables Flow to Code

┌─────────────────────────────────────────────────────────────────────────────┐
│                              GitHub                                          │
│  ┌─────────────────────┐     ┌─────────────────────┐                        │
│  │   Repository        │     │   Repository        │                        │
│  │   Secrets           │     │   Variables         │                        │
│  │   (sensitive)       │     │   (non-sensitive)   │                        │
│  └──────────┬──────────┘     └──────────┬──────────┘                        │
└─────────────┼───────────────────────────┼───────────────────────────────────┘
              │                           │
              ▼                           ▼
┌─────────────────────────────────────────────────────────────────────────────┐
│                    GitHub Actions Workflow                                   │
│                    (.github/workflows/_sst-deploy.yml)                       │
│                                                                              │
│  env:                                                                        │
│    AUTH_SECRET: ${{ secrets.AUTH_SECRET }}      ◄── from secrets            │
│    AWS_REGION: ${{ vars.AWS_REGION }}           ◄── from variables          │
│    ROOT_DOMAIN: ${{ vars.ROOT_DOMAIN }}                                      │
└──────────────────────────────┬──────────────────────────────────────────────┘
┌─────────────────────────────────────────────────────────────────────────────┐
│                         SST Deploy (sst.config.ts)                           │
│                                                                              │
│  process.env.AUTH_SECRET      ◄── available via process.env                 │
│  process.env.ROOT_DOMAIN                                                     │
│                                                                              │
│  const environment = {                                                       │
│    DATABASE_URL,                                                             │
│    AUTH_SECRET: process.env.AUTH_SECRET,                                     │
│    ...                                                                       │
│  };                                                                          │
└──────────────────────────────┬──────────────────────────────────────────────┘
              ┌────────────────┼────────────────┐
              ▼                ▼                ▼
┌─────────────────┐  ┌─────────────────┐  ┌─────────────────┐
│  Lambda         │  │  Next.js App    │  │  Step Functions │
│  Functions      │  │                 │  │                 │
│                 │  │  environment:   │  │                 │
│  environment:   │  │    {...}        │  │  (invokes       │
│    {...}        │  │                 │  │   Lambdas)      │
└─────────────────┘  └─────────────────┘  └─────────────────┘

Workflow File Configuration (.github/workflows/_sst-deploy.yml):

# Secrets are passed from calling workflow
secrets:
  AUTH_SECRET: { required: false }
  AUTH_COGNITO_SECRET: { required: false }
  # ...

# Environment variables are set for the job
env:
  # From GitHub Variables (non-sensitive)
  AWS_ACCOUNT_ID: ${{ vars.AWS_ACCOUNT_ID }}
  ROOT_DOMAIN: ${{ vars.ROOT_DOMAIN }}
  AWS_REGION: ${{ vars.AWS_REGION }}

  # From GitHub Secrets (sensitive)
  AUTH_SECRET: ${{ secrets.AUTH_SECRET }}
  AUTH_COGNITO_SECRET: ${{ secrets.AUTH_COGNITO_SECRET }}
  # ...

SST Config (sst.config.ts):

// Environment object passed to Lambda functions and Next.js
const environment = {
  DATABASE_URL,
  AUTH_SECRET: process.env.AUTH_SECRET,
  ROOT_DOMAIN: process.env.ROOT_DOMAIN,
  // ... other env vars
};

// Passed to Lambda functions
new sst.aws.Function('MyLambda', {
  environment,  // ◄── env vars available in Lambda
});

// Passed to Next.js app
new sst.aws.Nextjs('BAGenieApp', {
  environment: {
    ...environment,
    // Additional Next.js specific vars
  },
});

GitHub Secrets

GitHub has two levels of secrets:

  • Repository Secrets - Available to all workflows (used by DEV & Preview)
  • Environment Secrets - Scoped to a specific environment (used by PROD)

Important: Environment secrets override Repository secrets with the same name. This allows PROD to use different values than DEV/Preview.

Repository Secrets

Settings → Secrets and variables → Actions → Repository secrets

Secret                  | Used By            | Description
AUTH_SECRET             | DEV, Preview       | NextAuth.js secret for session encryption
AUTH_COGNITO_SECRET     | DEV, Preview       | AWS Cognito client secret
AZURE_OPENAI_API_KEY    | DEV, Preview, PROD | Azure OpenAI API key
RECALLAI_API_KEY        | DEV, Preview       | Recall.ai API key for bot functionality
RECALLAI_WEBHOOK_SECRET | DEV, Preview       | Webhook secret for Recall.ai callbacks

Environment Secrets - prod

Settings → Environments → prod → Environment secrets

Secret                         | Description
AUTH_SECRET                    | Production NextAuth.js secret (overrides repository)
AUTH_COGNITO_SECRET            | Production Cognito client secret (overrides repository)
AUTH_MICROSOFT_ENTRA_ID_SECRET | Microsoft Entra ID OAuth secret (PROD only)
RECALLAI_API_KEY               | Production Recall.ai API key (overrides repository)
RECALLAI_WEBHOOK_SECRET        | Production webhook secret (overrides repository)

Note: AZURE_OPENAI_API_KEY is only in Repository secrets and is shared across all environments.

GitHub Variables

These are non-sensitive configuration values stored in GitHub Variables:

Variable                       | Description
AWS_ACCOUNT_ID                 | AWS Account ID for deployment
ROOT_DOMAIN                    | Base domain (e.g., ba-genie.dev.bytemethod.ai)
AWS_REGION                     | AWS Region (e.g., us-east-1)
AUTH_COGNITO_ID                | Cognito User Pool App Client ID
AUTH_COGNITO_ISSUER            | Cognito Issuer URL
RECALLAI_REGION                | Recall.ai region
AUTH_MICROSOFT_ENTRA_ID_ID     | PROD only - Microsoft Entra ID Client ID
AUTH_MICROSOFT_ENTRA_ID_ISSUER | PROD only - Microsoft Entra ID Issuer URL

Debugging & Logs

Bot Workflow (Step Function)

Resource Name Pattern: {app-name}-{stage}-BotSFN
Example: ba-genie-app-dev-BotSFN

How to find logs in AWS Console:

  1. Go to AWS Console → Step Functions
  2. Search for BotSFN in the state machines list
  3. Select the state machine matching your stage (e.g., ba-genie-app-dev-BotSFN)
  4. Click on the Executions tab to see all executions
  5. Click on a specific execution to see:
     • Execution graph with step-by-step status
     • Input/Output for each step
     • Error details if failed

Bot SFN Flow:

Start Bot → Process Recording → Process Transcription → Convert Transcription → Generate Meeting Notes

Related Lambda Functions (check CloudWatch Logs):

  • StartBotLambda - Starts the bot and waits for the recording
  • ProcessRecordingLambda - Processes the recording and waits for the transcript
  • ProcessTranscriptionLambda - Converts the transcription to BA Genie format
  • ConvertTranscriptionLambda - Converts the transcription to an input file

Document Processing (Step Function)

Resource Name Pattern: ba-genie-app-{stage}-ProcessDocumentsSFN
Example: ba-genie-app-dev-ProcessDocumentsSFN

How to find logs in AWS Console:

  1. Go to AWS Console → Step Functions
  2. Search for ProcessDocumentsSFN in the state machines list
  3. Select the state machine matching your stage
  4. Click on the Executions tab

Document Processing SFN Flow:

Init → Execute Service (waits for task token) → Success Action → Success
                 ↓ (on error)
            Failure Action → Failure

Related Lambda Functions (check CloudWatch Logs):

  • InitLambda - Initializes document processing
  • SuccessActionLambda - Handles successful processing
  • FailureActionLambda - Handles failures
  • MeetingNotesLambda - Generates meeting notes
  • RequirementsBacklogLambda - Generates the requirements backlog
  • DiscoveryDocLambda - Generates discovery documents
  • ProcessFlowLambda - Generates process flows
  • DocumentChunkingLambda - Chunks large documents

To find Lambda logs:

  1. Go to AWS Console → CloudWatch → Log Groups
  2. Search for /aws/lambda/{function-name} (e.g., /aws/lambda/ba-genie-app-dev-InitLambda)

Email Invite Handler

Resource Name Pattern: {app-name}-{stage}-EmailReceivedSnsHandler
Example: ba-genie-app-dev-EmailReceivedSnsHandler

How to find logs in AWS Console:

  1. Go to AWS Console → CloudWatch → Log Groups
  2. Search for EmailReceivedSnsHandler
  3. Select the log group matching your stage

Email Processing Flow:

Email received → SES → S3 (EmailInviteS3) → SNS (EmailReceivedSNS) → Lambda (processEmailInvite) → Bot SFN

Related Resources:

  • S3 Bucket: EmailInviteS3 - Stores incoming emails (auto-deleted after 7 days)
  • SNS Topic: EmailReceivedSNS - Triggers on new email
  • Lambda Handler: functions/email.processEmailInvite


Local Development

Testing Step Functions with SST Dev

How it works:

  1. SST deploys infrastructure to AWS (including Step Functions)
  2. Lambda functions are proxied to your local machine
  3. Step Functions execute in AWS but invoke your local Lambda code
  4. Logs appear in your terminal in real time

Steps:

  1. Start SST Dev:

    npx sst dev
    

  2. Trigger the Step Function:
     • Bot SFN: Send an email invite or trigger via the UI
     • Document Processing SFN: Upload a document via the UI

  3. View logs: Lambda execution logs appear in your terminal

  4. View the Step Function execution in AWS Console:
     • Go to AWS Console → Step Functions
     • Find your state machine (e.g., ba-genie-app-{your-username}-BotSFN)
     • Click on the execution to see the graph, input/output for each step, and errors

Useful Commands:

# Start dev mode
npx sst dev

Prerequisites:

  • AWS credentials configured with the bagenie profile (see sst.config.ts)
  • Copy .env.example to .env.local and fill in the required values
  • Set STAGE in .env.local to your name/nickname (e.g., STAGE=daniel) - this helps identify your resources in AWS Console (e.g., ba-genie-app-daniel-BotSFN)


Versions/Features

Baseline v1 - Core Bot & Document Generation

Overview

The foundational release of BA Genie includes automated meeting bot invitations, recording/transcription processing, and intelligent document generation from meeting content.

Features

1. Bot Invitation & Meeting Capture

Email-Based Bot Invitations

  • Users forward calendar invites to genie.bot@{stage}.{ROOT_DOMAIN}
  • The system parses ICS files to extract meeting details (time, link, participants)
  • Automatically schedules a Recall.ai bot to join meetings (Zoom, Google Meet, Teams)
  • Supports both immediate and future meeting scheduling

Manual Meeting Start

  • Direct meeting URL input via the UI
  • Instant bot deployment to ongoing meetings
  • Real-time meeting validation

Architecture Flow:

User forwards calendar invite
SES receives email → S3 storage
SNS triggers Lambda (processEmailInvite)
Parse ICS with node-ical
Create Meeting record in database
Start BotSFN Step Function
Recall.ai bot joins meeting at scheduled time
Records video/audio + transcription
Webhook delivers recording & transcript to S3

Key Components:

  • Email Handler (functions/email.ts): Parses calendar invites, manages the meeting lifecycle (create/update/cancel)
  • Bot Lambda (functions/bot.ts): Starts the Recall.ai bot with a task token, configures recording settings
  • BotSFN (infra/bot.ts): Orchestrates the entire bot workflow with Step Functions
  • Webhook Handler (app/api/webhook/route.ts): Receives Recall.ai events (recording.done, transcript.done)

Supported Meeting Platforms:

  • Zoom
  • Google Meet
  • Microsoft Teams


2. Process Flow Document Generation

Purpose: Automatically generates visual process flow diagrams and documentation from meeting transcripts.

Generator: ProcessFlowGenerator (lib/services/generators/process-flow.ts)

Input: Meeting transcripts with business process discussions

Output: Markdown document with:

  • Process overview and objectives
  • Step-by-step workflow descriptions
  • Decision points and branching logic
  • Actor/role descriptions
  • System interactions
  • Exception handling flows

AI Model: Uses Bedrock Claude to analyze conversations and extract structured process flows

Storage: Generated documents saved to ProjectsS3 bucket at projects/{projectId}/process-flow-{timestamp}.md

Lambda Handler: ProcessFlowLambda (functions/document-processing.ts)

Trigger: User-initiated via project UI after meeting notes are available


3. Discovery Document Generation

Purpose: Creates comprehensive discovery documentation capturing requirements, context, and business needs discussed in meetings.

Generator: DiscoveryGenerator (lib/services/generators/discovery.ts)

Input: Meeting transcripts and existing project context

Output: Structured markdown document with:

  • Business context and background
  • Stakeholder identification
  • Problem statements
  • Solution requirements
  • Constraints and assumptions
  • Success criteria
  • Open questions and risks

AI Model: Bedrock Claude analyzes transcripts to extract discovery information

Storage: Saved to ProjectsS3 at projects/{projectId}/discovery-doc-{timestamp}.md

Lambda Handler: DiscoveryDocLambda (functions/document-processing.ts)

Trigger: User-initiated via project UI


4. Requirements Backlog Generation

Purpose: Automatically extracts and organizes user stories, epics, and technical requirements from meeting discussions.

Generator: RequirementsBacklogGenerator (lib/services/generators/requirements-backlog.ts)

Input: Meeting transcripts and project documentation

Output: Markdown document with:

  • Epics (high-level features)
  • User stories with acceptance criteria
  • Technical requirements
  • Non-functional requirements
  • Dependencies between requirements
  • Priority indicators

Structure:

{
  "epics": [
    {
      "epicName": "Epic Title",
      "description": "...",
      "userStories": [
        {
          "storyTitle": "As a user...",
          "acceptanceCriteria": ["..."],
          "priority": "high"
        }
      ]
    }
  ]
}

AI Model: Bedrock Claude with specialized prompts for requirement extraction

Storage: Saved to ProjectsS3 at projects/{projectId}/requirements-backlog-{timestamp}.md

Lambda Handler: RequirementsBacklogLambda (functions/document-processing.ts)

Trigger: User-initiated via project UI


Step Function Flow (ProcessDocumentsSFN)

Init
Execute Service (waits for task token)
  - MeetingNotesLambda / ProcessFlowLambda / DiscoveryDocLambda / RequirementsBacklogLambda
Success Action
  - Update database
  - Notify via WebSocket
Success

(On Error)
Failure Action
  - Log error
  - Notify user
Failure

Common Features Across All Generators:

  • Streaming AI responses with progress updates via WebSocket
  • S3 storage with timestamped filenames
  • Database records linking documents to projects
  • Error handling with detailed logging
  • Retry logic for transient failures


Resource Names (v1)

  • Bot Step Function: ba-genie-app-{stage}-BotSFN
  • Document Processing SFN: ba-genie-app-{stage}-ProcessDocumentsSFN
  • Email Handler Lambda: ba-genie-app-{stage}-EmailReceivedSnsHandler
  • Process Flow Lambda: ba-genie-app-{stage}-ProcessFlowLambda
  • Discovery Doc Lambda: ba-genie-app-{stage}-DiscoveryDocLambda
  • Requirements Lambda: ba-genie-app-{stage}-RequirementsBacklogLambda

v2 - Video Processing & Manual Transcripts

Overview

Version 2 expands input capabilities by supporting direct video/audio uploads and manual transcript uploads, eliminating the dependency on live meeting bots for all workflows.

New Features

1. Video/Audio Upload Processing

Purpose: Process pre-recorded meetings or videos without requiring a live bot to join.

Supported Formats:

  • Video: MP4, MOV, AVI, WebM
  • Audio: MP3, WAV, M4A, OGG

Workflow:

User uploads video/audio file via UI
File stored in MeetingsS3
VideoUploadSFN triggered
ProcessVideoUploadLambda
  - Generates presigned S3 URL (12hr expiry)
  - Submits to external transcription API
  - Polls for completion (max 30 min timeout)
Receives transcription with speaker diarization
Converts to BA Genie transcript format
Stores in S3 as JSON
ConvertTranscription → MeetingNotes → Email flow

Lambda Handler: functions/video-processing.ts

Key Features:

  • Speaker Diarization: Automatically identifies and labels different speakers
  • Timestamp Accuracy: Preserves precise timestamps for each utterance
  • Long Video Support: 12-hour presigned URLs for processing lengthy recordings
  • Polling Mechanism: Polls transcription status with exponential backoff, starting at 30-second intervals
  • Chunk Processing: Handles large videos by processing them in chunks

External API Integration:

  • Uses a custom transcription service (configured via TRANSCRIPTION_API_KEY_HASHED)
  • Supports multi-chunk processing for videos longer than 1 hour
  • Returns structured JSON with speaker labels and timestamps
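The polling behavior described above can be sketched as a loop with exponential backoff and an overall deadline. This is a hedged reconstruction, not the code from functions/video-processing.ts: checkStatus is a stand-in for the real transcription API client, and the 5-minute backoff cap is an assumption.

```typescript
// Sketch of polling with exponential backoff and a 30-minute overall timeout,
// matching the behavior described above. `checkStatus` is a hypothetical
// stand-in for the external transcription API client.
type Status = "pending" | "done" | "failed";

function backoffDelayMs(attempt: number, baseMs = 30_000, capMs = 5 * 60_000): number {
  // 30s, 60s, 120s, ... capped at 5 minutes (cap is an assumption).
  return Math.min(baseMs * 2 ** attempt, capMs);
}

async function pollUntilDone(
  checkStatus: () => Promise<Status>,
  timeoutMs = 30 * 60_000,
): Promise<Status> {
  const deadline = Date.now() + timeoutMs;
  for (let attempt = 0; Date.now() < deadline; attempt++) {
    const status = await checkStatus();
    if (status !== "pending") return status; // done or failed
    await new Promise((r) => setTimeout(r, backoffDelayMs(attempt)));
  }
  throw new Error("Transcription polling timed out");
}
```

The deadline check before each poll is what enforces the documented 30-minute maximum regardless of how slowly the backoff grows.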

Step Function: VideoUploadSFN (infra/bot.ts)

ProcessVideoUpload → ConvertTranscription → GenerateMeetingNotes → GenerateSummary → SendEmail

Error Handling:

  • Validates file size and format before processing
  • Retries failed API calls up to 3 times
  • Provides detailed error messages to users via WebSocket
  • Automatically cleans up failed processing artifacts


2. Manual Transcript Upload Processing

Purpose: Allow users to upload existing transcripts in various formats without needing video/audio files.

Supported Formats:

  • Plain text (.txt)
  • JSON (.json) - BA Genie or generic transcript format
  • WebVTT (.vtt) - Video subtitle format
  • SubRip (.srt) - Subtitle format
  • DOCX (.docx) - Word documents (basic text extraction)

Workflow:

User uploads transcript file via UI
File stored in MeetingsS3
ManualUploadSFN triggered
ProcessManualUploadLambda
  - Detects file format from extension
  - Parses content based on format
  - Converts to BA Genie Transcription format
Stores standardized JSON in S3
ConvertTranscription → MeetingNotes → Email flow

Lambda Handler: functions/manual-upload.ts

Format-Specific Parsing:

Plain Text (.txt):

  • Splits by double newlines or speaker patterns
  • Detects speaker labels (e.g., "John:", "Speaker 1:")
  • Assigns sequential timestamps
  • Fallback: treats the entire content as a single speaker

JSON (.json):

// Supported structure:
{
  "transcription": [
    {
      "timestamp": 0,
      "text": "Hello world",
      "speaker": "John Doe"
    }
  ]
}

WebVTT (.vtt):

00:00:10.000 --> 00:00:15.000
Speaker 1: This is the text

Parsed into timestamped blocks with speaker labels

SubRip (.srt):

1
00:00:10,000 --> 00:00:15,000
Speaker 1: This is the text

Converts SRT timestamps to seconds

DOCX (.docx):

  • Uses the mammoth library for text extraction
  • Strips formatting, keeps plain text
  • Processes as plain text with speaker detection

Speaker Detection Algorithm:

  1. Searches for patterns: Name:, Speaker N:, [Name]
  2. Extracts the speaker label and dialogue
  3. Maintains speaker consistency across blocks
  4. Falls back to "Unknown Speaker" if no pattern is found
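The detection step could look like the following sketch. The pattern names (Name:, Speaker N:, [Name]) come from the description above, but the regexes themselves are illustrative assumptions; the actual ones in functions/manual-upload.ts may differ.

```typescript
// Hypothetical sketch of one block of the speaker-detection algorithm above.
// The regexes are illustrative; the real implementation may differ.
function detectSpeaker(line: string): { speaker: string; text: string } {
  // [Name] dialogue
  const bracket = /^\[([^\]]+)\]\s*(.*)$/.exec(line);
  if (bracket) return { speaker: bracket[1], text: bracket[2] };
  // Name: dialogue / Speaker N: dialogue (capitalized label up to ~40 chars)
  const colon = /^([A-Z][\w .'-]{0,40}):\s*(.*)$/.exec(line);
  if (colon) return { speaker: colon[1], text: colon[2] };
  // Fallback described in step 4 above
  return { speaker: "Unknown Speaker", text: line };
}
```

Speaker consistency across blocks (step 3) would then be a matter of reusing the last detected label for unlabeled continuation lines.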

Timestamp Handling:

  • If the source has no timestamps: generates sequential timestamps (0, 10, 20, 30...)
  • If timestamps exist: preserves the original timing
  • Converts various time formats (HH:MM:SS, MM:SS.SSS, seconds) to a uniform seconds format
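The time-format conversion above can be captured in one small function. This is a sketch under the assumption that the parser handles the three listed formats plus SRT's comma decimal separator; the real converter may cover more cases.

```typescript
// Sketch of converting HH:MM:SS, MM:SS.SSS, or plain seconds into the
// uniform seconds format described above. SRT commas ("00:00:10,000")
// are normalized to dots first.
function toSeconds(ts: string): number {
  const parts = ts.trim().replace(",", ".").split(":").map(Number);
  if (parts.some(Number.isNaN)) throw new Error(`Unparseable timestamp: ${ts}`);
  // Each colon shifts the accumulated value by a factor of 60,
  // so ["01","02","03"] becomes (1 * 60 + 2) * 60 + 3 = 3723.
  return parts.reduce((acc, p) => acc * 60 + p, 0);
}
```

For example, the SRT timestamp 00:00:10,000 becomes 10, and 01:02:03 becomes 3723.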

Step Function: ManualUploadSFN (infra/bot.ts)

ProcessManualUpload → ConvertTranscription → GenerateMeetingNotes → GenerateSummary → SendEmail

Validation:

  • Minimum content length check (prevents empty files)
  • Format validation (ensures parseable content)
  • Speaker detection quality check
  • Warns users if transcript quality is low


Unified Transcription Format

All processing methods (Bot, Video, Manual) convert to this standard format:

interface Transcription {
  transcription: TranscriptionBlock[];
}

interface TranscriptionBlock {
  timestamp: number;        // seconds from start
  text: string;             // spoken text
  speaker?: string;         // speaker identifier
  speakerIndex?: number;    // numeric speaker ID
}

This standardization enables: - Consistent meeting notes generation - Uniform document processing - Shared processing pipeline (ConvertTranscription → MeetingNotes)


Resource Names (v2)

  • Video Upload SFN: ba-genie-app-{stage}-VideoUploadSFN
  • Manual Upload SFN: ba-genie-app-{stage}-ManualUploadSFN
  • Video Processing Lambda: ba-genie-app-{stage}-ProcessVideoUploadLambda
  • Manual Upload Lambda: ba-genie-app-{stage}-ProcessManualUploadLambda
  • Convert Transcription Lambda: ba-genie-app-{stage}-ConvertTranscriptionLambda

v3 - Email, Meeting Summary, and Delete Capability

Overview

After meeting notes are generated, the system automatically:

  1. Generates an AI-powered summary of the meeting notes using Bedrock Claude
  2. Sends an email to the user with the summary in the body and the meeting notes as a DOCX attachment

Architecture Flow

Meeting Notes Generated
Generate Meeting Summary Lambda
  - Fetches meeting notes from S3
  - Generates AI summary using Bedrock Claude
  - Stores metadata in S3
Send Meeting Email Lambda
  - Fetches metadata from S3
  - Converts MD to DOCX
  - Sends email via SES with attachment

Components

Lambda Functions

functions/meeting-summary.ts

  • Handler: generateSummary
  • Input: { meetingId: string }
  • Process:
    1. Fetches the meeting from the database (includes user email)
    2. Gets the meeting notes location from meeting.data.externalData.meetingNotes
    3. Fetches the notes content from S3
    4. Calls Bedrock Claude (us.anthropic.claude-sonnet-4-20250514-v1:0) to generate a 3-5 bullet point summary
    5. Stores email metadata in S3 at meetings/{meetingId}/email-metadata.json
  • Timeout: 5 minutes
  • Permissions:
    - S3: GetObject, PutObject
    - Bedrock: InvokeModel, InvokeModelWithResponseStream

functions/send-meeting-email.ts

  • Handler: sendEmail
  • Input: { meetingId: string }
  • Process:
    1. Fetches the email metadata from S3
    2. Downloads the meeting notes from S3
    3. Converts Markdown to DOCX (if not already available)
    4. Sends the email via SES using sendMeetingNotesEmail()
    5. Updates the meeting record with emailSent data
  • Timeout: 2 minutes
  • Permissions:
    - S3: GetObject
    - SES: SendEmail, SendRawEmail

Step Functions Integration

All three Step Functions include the email flow at the end:

  • BotSFN: startBot → processRecording → processTranscription → convertTranscription → meetingNotes → generateMeetingSummary → sendMeetingEmail
  • ManualUploadSFN: processManualUpload → convertTranscription → meetingNotes → generateMeetingSummary → sendMeetingEmail
  • VideoUploadSFN: processVideoUpload → convertTranscription → meetingNotes → generateMeetingSummary → sendMeetingEmail

Data Flow Through Step Functions

meetingNotesSfnInvoke (nested SFN)
  output: { meetingId }
generateMeetingSummaryLambdaInvoke
  payload: { meetingId }
  → Stores: meetings/{meetingId}/email-metadata.json
  output: { meetingId, success }
sendMeetingEmailLambdaInvoke
  payload: { meetingId }
  → Reads: meetings/{meetingId}/email-metadata.json
  output: { success, messageId }

Email Service

lib/services/email-sender.ts

  • Function: sendMeetingNotesEmail()
  • Sender: noreply@ba-genie.itsaiplatform.com
  • Format: MIME multipart with DOCX attachment
  • Region: us-east-1 (SES)

AWS SES Setup

Domain Verification

  1. Add a domain identity in the SES Console for ba-genie.itsaiplatform.com
  2. Add DNS records:
     • DKIM: 3 CNAME records (provided by SES)
     • SPF: TXT record v=spf1 include:amazonses.com ~all
     • DMARC: TXT record v=DMARC1; p=quarantine; rua=mailto:dmarc@ba-genie.itsaiplatform.com

Sandbox vs Production Mode

Sandbox Mode (Development):

  • Both the sender AND the recipient must be verified
  • Verify recipient emails: SES Console → Verified Identities → Create Identity → Email
  • Limited sending quota

Production Mode:

  • Request production access in the SES Console to remove sandbox restrictions
  • Can send to any email address
  • Higher sending quotas

Email Metadata Structure (S3)

Stored at s3://meetings-bucket/meetings/{meetingId}/email-metadata.json:

{
  "meetingId": "uuid",
  "summary": "• Point 1\n• Point 2\n• Point 3",
  "notesBucket": "bucket-name",
  "notesKey": "meetings/uuid/meeting-notes.md",
  "recipientEmail": "user@example.com",
  "meetingTitle": "Meeting Subject",
  "generatedAt": "2026-02-23T10:00:00.000Z"
}

Debugging Email Flow

Check if email was sent:

# View Lambda logs
aws logs tail /aws/lambda/ba-genie-app-{stage}-SendMeetingEmailLambdaFunction --follow

# Check SES sending statistics
aws ses get-send-statistics --region us-east-1

# Manually trigger email resend
aws lambda invoke \
  --function-name ba-genie-app-{stage}-SendMeetingEmailLambdaFunction \
  --region us-east-1 \
  --payload '{"meetingId": "your-meeting-id"}' \
  --cli-binary-format raw-in-base64-out \
  /tmp/response.json

Common Issues:

  1. Email not received:
     • Check SES Console → Sending Statistics for delivery status
     • Verify the recipient email is verified (sandbox mode)
     • Check the spam/junk folder
     • Check that the sender domain matches the verified domain (noreply@ba-genie.itsaiplatform.com)
     • Check CloudWatch logs for errors

  2. Summary generation fails:
     • Check Bedrock model access: us.anthropic.claude-sonnet-4-20250514-v1:0
     • Verify the meeting notes exist in S3
     • Check CloudWatch logs: /aws/lambda/ba-genie-app-{stage}-GenerateMeetingSummaryLambdaFunction

  3. Step Function fails:
     • Check the Step Function execution graph in AWS Console
     • Verify the meetingId flows through all steps
     • Check individual Lambda CloudWatch logs
Resource Names:

  • Generate Summary Lambda: ba-genie-app-{stage}-GenerateMeetingSummaryLambdaFunction
  • Send Email Lambda: ba-genie-app-{stage}-SendMeetingEmailLambdaFunction
  • CloudWatch Logs: /aws/lambda/ba-genie-app-{stage}-{FunctionName}


Delete Capability

Purpose: Provide administrative control and workspace management through comprehensive delete functionality across the application.

Supported Delete Operations:

1. Delete Meetings

  • Single/Bulk: Delete individual meetings or multiple selected meetings
  • Action: deleteMeetings() in lib/actions/meetings.ts
  • Scope: Hard delete from the database
  • Cleanup:
    - Deletes meeting records
    - Deletes associated S3 objects (meeting notes, transcripts, summaries)
    - Removes meeting-project associations
    - Deletes related documents (if the meeting is the sole source)
    - Deletes document chunks for affected documents
  • UI: Meeting table row actions and bulk actions toolbar
  • Trigger: User-initiated via the three-dot menu or bulk selection

2. Delete Projects

  • Action: deleteProject() in lib/actions/projects.ts
  • Scope: Soft delete (sets the deletedAt timestamp)
  • Cleanup:
    - Soft deletes the project
    - Soft deletes all project documents
    - Soft deletes all document chunks
    - Soft deletes all meeting-project associations
  • Authorization: Only the project owner can delete
  • UI: Project card dropdown menu and context menu
  • Trigger: User-initiated via the three-dot menu or right-click

3. Delete Project Documents (Inputs/Outputs)

  • Single Delete: deleteProjectDocument() in lib/actions/documents.ts
  • Bulk Delete: deleteProjectDocuments() in lib/actions/documents.ts
  • Scope: Soft delete for database records
  • Cleanup:
    - Soft deletes document records
    - Soft deletes associated document chunks
    - Deletes S3 files (only for non-meeting documents)
    - Unlinks the meeting from the project if the document source is a meeting
  • UI: Project document table row actions and bulk actions
  • Trigger: User-initiated for both input and output documents

Feature Flag Control:

// lib/utils/constants.ts
export const FRONTEND_DELETE_ENABLED = process.env.NEXT_PUBLIC_ENABLE_DELETE === 'true';

Delete Flow Architecture:

User initiates delete
DeleteConfirmationModal shown
User confirms deletion
Server action invoked (deleteMeetings/deleteProject/deleteProjectDocument)
Database transaction
  - Update deletedAt timestamps (soft delete)
  - OR hard delete records (meetings)
S3 cleanup (if applicable)
  - Delete objects via DeleteObjectCommand/DeleteObjectsCommand
Cascade deletions
  - Document chunks
  - Associations (meeting-project)
Revalidate paths
Toast notification to user

Key Components:

DeleteConfirmationModal (components/modals/delete-confirmation-modal.tsx):

  • Reusable confirmation dialog
  • Shows an entity-specific title and message
  • Displays a loading state during deletion
  • Used across meetings, projects, and documents

Meeting Deletion Details:

  • Collects S3 objects from meeting data:
    - externalData.meetingNotes (meeting notes file)
    - externalData.markdownTranscription (transcript file)
    - externalData.meetingSummary (summary file)
  • Groups S3 objects by bucket for batch deletion
  • Finds and deletes documents where the meeting is the sole source
  • Hard deletes from the database (not a soft delete)
  • Removes meeting-project associations
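The group-by-bucket step mentioned above can be sketched as a small pure function. This is a hedged reconstruction, not the actual code in lib/actions/meetings.ts; the S3Ref shape is an assumption. Each resulting group would then be passed to one DeleteObjectsCommand per bucket (the S3 DeleteObjects API accepts up to 1000 keys per request).

```typescript
// Sketch of grouping S3 objects by bucket before batch deletion, as
// described above. The S3Ref shape and function name are hypothetical.
interface S3Ref {
  bucket: string;
  key: string;
}

function groupByBucket(objects: S3Ref[]): Map<string, string[]> {
  const groups = new Map<string, string[]>();
  for (const { bucket, key } of objects) {
    const keys = groups.get(bucket) ?? [];
    keys.push(key);
    groups.set(bucket, keys);
  }
  return groups;
}
```

Batching per bucket matters because DeleteObjects is scoped to a single bucket, so mixed-bucket cleanup (notes, transcripts, summaries) cannot go through one call.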

Project Deletion Details:

  • Verifies user ownership before deletion
  • Performs cascading soft deletes in a transaction:
    1. Document chunks for all project documents
    2. All documents in the project
    3. Meeting-project associations
    4. The project record itself
  • Does NOT delete S3 files (preserves data)
  • Uses timestamps for an audit trail

Document Deletion Details:

  • Distinguishes between meeting-sourced and user-uploaded documents
  • Only deletes S3 files for user-uploaded documents
  • Meeting-sourced documents: S3 files retained, only the DB record is soft deleted
  • Unlinks the meeting from the project if the document is meeting-derived
  • Updates the project activity timestamp

Best Practices:

  • Always show a confirmation modal before deletion
  • Provide clear feedback via toast notifications
  • Use transactions for multi-step deletions
  • Revalidate paths after a successful deletion
  • Log deletion operations for debugging
  • Distinguish between soft delete (audit trail) and hard delete (cleanup)

Debugging:

# Check deleted records (soft delete)
# Query Prisma with where: { deletedAt: { not: null } }

# Check S3 cleanup
aws s3 ls s3://{bucket-name}/meetings/{meetingId}/

# View deletion logs
aws logs tail /aws/lambda/ba-genie-app-{stage}-{FunctionName} --follow

Security:

  • All delete operations require authentication
  • Project deletion requires ownership verification
  • Server-side validation prevents unauthorized deletions
  • A feature flag allows disabling the delete UI per environment


Quick Reference

Resource                | AWS Console Path                | Search Term
Bot SFN                 | Step Functions → State machines | BotSFN
Document Processing SFN | Step Functions → State machines | ProcessDocumentsSFN
Email Handler           | CloudWatch → Log Groups         | EmailReceivedSnsHandler
Any Lambda              | CloudWatch → Log Groups         | /aws/lambda/{stage}-{name}
S3 Buckets              | S3 → Buckets                    | {stage}- prefix

Troubleshooting

Step Function Stuck

  • Check execution in AWS Console → Step Functions
  • Look for tasks waiting for task token
  • Check corresponding Lambda logs in CloudWatch

Email Not Processing

  • Check Lambda logs for EmailReceivedSnsHandler
  • Check SNS topic subscription
  • Verify S3 bucket received the email
  • Check SES → Email Receiving for rule status

Document Processing Failed

  • Check Step Function execution graph
  • Look at the failed step's input/output
  • Check Lambda logs for the specific Lambda that failed
  • Look for timeout issues (default 15 minutes for processing Lambdas)