
Initial Project Setup and AI SDK Integration

Inspect the original prompt language first, then copy or adapt it once you know how it fits your workflow.

Linked challenge: AI-Powered Regulatory Drafting Assistant

Format: Code-aware
Lines: 16
Sections: 7

Prompt source

Original prompt text with formatting preserved for inspection.

16 lines · 7 sections · no variables · 1 code block
Set up a new Next.js project and integrate the Vercel AI SDK. Initialize a client for Gemini 2.5 Pro. Create a basic chat component that can send prompts to the model and display streaming responses in the UI. Ensure your project structure supports future integration of structured output validation and voice features.

```typescript
// app/api/chat/route.ts
import { GoogleGenerativeAI } from '@google/generative-ai';
import { GoogleGenerativeAIStream, Message, StreamingTextResponse } from 'ai';

const genai = new GoogleGenerativeAI(process.env.GOOGLE_API_KEY || '');

export const runtime = 'edge';

export async function POST(req: Request) {
  const { messages } = await req.json();

  // The task above targets Gemini 2.5 Pro; the original snippet asked for 'gemini-pro'.
  const model = genai.getGenerativeModel({ model: 'gemini-2.5-pro' });
  const stream = await model.generateContentStream({
    // Gemini's API uses the role 'model' where the AI SDK's Message uses 'assistant'.
    contents: messages.map((m: Message) => ({
      role: m.role === 'user' ? 'user' : 'model',
      parts: [{ text: m.content }],
    })),
  });

  return new StreamingTextResponse(GoogleGenerativeAIStream(stream));
}
```
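The message conversion inside the route is worth pulling into its own helper, since Gemini's `contents` format uses the role `model` where the AI SDK uses `assistant`, and has no `system` role in `contents` at all. A minimal sketch — the helper name and standalone types here are hypothetical, not part of the original prompt:

```typescript
// Hypothetical helper: convert AI-SDK-style chat messages into the
// { role, parts } shape the Gemini generateContent API expects.
type ChatRole = 'user' | 'assistant' | 'system';

interface ChatMessage {
  role: ChatRole;
  content: string;
}

interface GeminiContent {
  role: 'user' | 'model';
  parts: { text: string }[];
}

function toGeminiContents(messages: ChatMessage[]): GeminiContent[] {
  return messages
    // Gemini has no 'system' role in contents; drop system messages here
    // (they belong in a system instruction instead).
    .filter((m) => m.role !== 'system')
    .map((m) => ({
      // Gemini calls the assistant side 'model'.
      role: m.role === 'user' ? 'user' : 'model',
      parts: [{ text: m.content }],
    }));
}
```

Keeping the conversion pure makes it easy to unit-test without touching the network.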

Adaptation plan

Keep the source stable, then change the prompt in a predictable order so the next run is easier to evaluate.

Keep stable

Hold the task contract and output shape stable so generated implementations remain comparable.
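One way to make the contract checkable across runs is a small runtime guard over the request body; the names below are hypothetical, a sketch rather than part of the prompt:

```typescript
// Hypothetical guard: verify that a parsed request body matches the
// { messages: [{ role, content }] } contract the chat route depends on.
interface InboundMessage {
  role: string;
  content: string;
}

function isValidChatBody(body: unknown): body is { messages: InboundMessage[] } {
  if (typeof body !== 'object' || body === null) return false;
  const messages = (body as { messages?: unknown }).messages;
  // Reject a missing or empty conversation outright.
  if (!Array.isArray(messages) || messages.length === 0) return false;
  return messages.every(
    (m) =>
      typeof m === 'object' &&
      m !== null &&
      typeof (m as InboundMessage).role === 'string' &&
      typeof (m as InboundMessage).content === 'string',
  );
}
```

A guard like this gives every generated implementation the same pass/fail line to clear, which is what makes them comparable.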

Tune next

Update libraries, interfaces, and environment assumptions to match the stack you actually run.
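Environment assumptions are easiest to audit when they fail fast. The `process.env.GOOGLE_API_KEY || ''` fallback in the snippet above hides a missing key until the first model call; a hedged alternative (helper name hypothetical) is to read required variables through a function that throws immediately:

```typescript
// Hypothetical helper: read a required variable from an env-like record
// (e.g. process.env) and fail immediately instead of passing an empty
// string through to the SDK.
function requireEnv(
  name: string,
  env: Record<string, string | undefined>,
): string {
  const value = env[name];
  if (value === undefined || value === '') {
    throw new Error(`Missing required environment variable: ${name}`);
  }
  return value;
}
```

Called as `requireEnv('GOOGLE_API_KEY', process.env)` at module load, this turns a silent misconfiguration into a deploy-time error.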

Verify after

Test failure handling, edge cases, and any code paths that depend on hidden context or secrets.
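One edge case worth a dedicated test: Gemini's multi-turn API expects the `contents` array to open with a user turn, and tends to reject histories that start with a model turn. A small hypothetical helper (not from the original prompt) makes that path testable in isolation:

```typescript
// Hypothetical edge-case helper: drop leading non-user turns so the
// history sent to Gemini starts with a 'user' entry, as the API expects.
interface Turn {
  role: 'user' | 'model';
  parts: { text: string }[];
}

function trimToUserStart(contents: Turn[]): Turn[] {
  const firstUser = contents.findIndex((c) => c.role === 'user');
  // No user turn at all: nothing safe to send.
  return firstUser === -1 ? [] : contents.slice(firstUser);
}
```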