Concrete Input-Output Examples

Core

Apply few-shot prompting to improve output consistency and quality · Difficulty 2/5

examples · data-transformation · prompting

When prose descriptions fail to communicate exact requirements, providing concrete input-output examples eliminates ambiguity.

The Problem with Prose

Prose requirements like "normalize the API response" can be interpreted differently on each attempt. Field nesting, date formats, and naming conventions are especially ambiguous in text descriptions.
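
Given only "normalize the API response," for example, the same source object supports more than one reasonable reading (the values below are illustrative):

// Source:
{"created_at": "2024-01-15T10:30:00Z", "user": {"name": "Alice"}}
// One plausible reading: flatten nesting, rename to camelCase, reformat the date
{"createdAt": "Jan 15, 2024", "userName": "Alice"}
// Another plausible reading: rename keys but keep nesting and the ISO timestamp
{"createdAt": "2024-01-15T10:30:00Z", "user": {"name": "Alice"}}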

Solution: Show, Don't Tell

Provide 2-3 representative input-output pairs showing the exact transformation expected:

// Input:
{"created_at": "2024-01-15T10:30:00Z", "user": {"name": "Alice"}}
// Expected Output:
{"createdAt": "Jan 15, 2024", "userName": "Alice"}

Format Demonstration

Few-shot examples should demonstrate the specific desired output format. For a code-review task, for instance, each example finding would include:

  • Issue location (file, line number)
  • Severity classification
  • Suggested fix
  • Reasoning for the finding

This achieves a consistency that instructions alone cannot, as the sketch below illustrates.
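
A single example finding might look like the following (the file name, severity label, and wording are hypothetical, not from the lesson):

// Illustrative example output for one finding (hypothetical values):
{
  "file": "src/api/client.ts",
  "line": 42,
  "severity": "warning",
  "issue": "Response body is used without validation",
  "suggestedFix": "Parse the response against the shared schema before returning it",
  "reasoning": "Unvalidated responses can pass malformed data to callers"
}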

Handling Varied Document Structures

For extraction from documents with varied formats (inline citations vs. bibliographies, methodology sections vs. embedded details), provide examples showing correct extraction from each format variant. Include examples with empty/null values for fields not present in the source document.
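
For example, two pairs might cover an inline-citation variant and a variant where no citation appears, with absent fields set to null (the schema and values below are illustrative, not from the lesson):

// Variant 1: document with an inline citation (hypothetical values)
// Input excerpt:
"...as shown by Smith et al. (2021), accuracy improved by 4%..."
// Expected output:
{"citation": "Smith et al. (2021)", "methodology": null}
// Variant 2: document with a methodology section but no citation
// Input excerpt:
"Methods: We surveyed 120 participants over six weeks."
// Expected output:
{"citation": null, "methodology": "Survey of 120 participants over six weeks"}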

Key Takeaways

  • Concrete examples eliminate ambiguity that prose descriptions create
  • Use input-output pairs when instructions produce inconsistent results
  • Include examples covering varied document structures and missing fields
  • 2-3 representative examples are usually sufficient

Test Yourself

You've asked Claude Code to implement a function that transforms API responses into a normalized format. After two iterations, the output structure still doesn't match expectations. You've been describing requirements in prose. What's the most effective approach for the next iteration?