implementation
Build Summary & Knowledge Graph Query Engine
Inspect the original prompt language first, then copy or adapt it once you know how it fits your workflow.
Linked challenge: Document AI: Summarize & Extract from Enterprise Content
Format: Text-first
Lines: 1
Sections: 1
Prompt source
Original prompt text with formatting preserved for inspection.
1 line · 1 section · no variables · 0 checklist items
Develop the LlamaIndex query engines for both document summarization and knowledge graph-enhanced querying. For summarization, consider using LlamaIndex's `ResponseSynthesizer` with a `tree_summarize` mode or a custom prompt for Gemini 2.5 Pro to generate a podcast-style script and key highlights. For knowledge graph queries, implement a `KnowledgeGraphQueryEngine` that can answer questions requiring relational understanding by leveraging your MongoDB Atlas Vector Search integration. Provide Python code for setting up these query engines and sample usage.
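Before adapting the prompt, it can help to see what the two requested engines do conceptually. The sketch below is a library-free stand-in, not the real LlamaIndex API: `summarize` is a hypothetical placeholder for an LLM call (e.g. Gemini 2.5 Pro), `tree_summarize` mimics how the `tree_summarize` response mode folds chunk summaries up a tree, and `kg_query` walks an in-memory triple store the way a `KnowledgeGraphQueryEngine` would answer a relational question. An actual implementation would wire these to LlamaIndex's response synthesizer and your MongoDB Atlas Vector Search store; check the API of your installed LlamaIndex version, as module layouts have changed across releases.

```python
# Library-free sketch of the two query patterns the prompt asks for.
# All names here are illustrative stand-ins, not LlamaIndex APIs.

def summarize(texts):
    """Hypothetical stand-in for an LLM summarization call:
    joins inputs and truncates. A real call would prompt the model."""
    return " | ".join(texts)[:200]

def tree_summarize(chunks, fanout=2):
    """Recursively fold chunk summaries up a tree, `fanout` children
    per parent, until one root summary remains. This is the shape of
    LlamaIndex's tree_summarize response mode, conceptually."""
    if len(chunks) == 1:
        return chunks[0]
    parents = [
        summarize(chunks[i:i + fanout])
        for i in range(0, len(chunks), fanout)
    ]
    return tree_summarize(parents, fanout)

# Toy knowledge graph as subject-predicate-object triples. A real
# engine would extract triples with an LLM and retrieve candidates
# via vector search before answering.
triples = [
    ("Acme Corp", "acquired", "Widget Inc"),
    ("Widget Inc", "headquartered_in", "Berlin"),
]

def kg_query(subject, predicate):
    """Answer a relational question by collecting matching objects."""
    return [o for s, p, o in triples if s == subject and p == predicate]
```

For example, `tree_summarize(["a", "b", "c", "d"])` first produces the parent summaries `"a | b"` and `"c | d"`, then folds those into a single root; `kg_query("Acme Corp", "acquired")` returns `["Widget Inc"]`. When you adapt the prompt, the pieces to swap for real components are the `summarize` stand-in (LLM call) and the `triples` list (graph store plus vector retrieval).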
Adaptation plan
Keep the source stable, then change the prompt in a predictable order so the next run is easier to evaluate.
Keep stable
Hold the task contract and output shape stable so generated implementations remain comparable.
Tune next
Update libraries, interfaces, and environment assumptions to match the stack you actually run.
Verify after
Test failure handling, edge cases, and any code paths that depend on hidden context or secrets.