⚡ Text Summariser
About this demo
Drop a plain-text file and get exactly five bullet-point
takeaways in a few seconds. Everything runs on a fully serverless
stack—there are no EC2 instances to patch or keep warm.
- AWS Lambda – one Python function that calls the LLM and writes results to S3.
- Amazon API Gateway (HTTP API) – HTTPS endpoint the browser hits.
- Amazon Bedrock – choose the model on the fly: Titan-Lite (default), Cohere Command-Light, or Claude 3 Haiku.
- Amazon S3 – stores your original file under `raw/` and the summary under `summary/`.
- AWS SAM – one template deploys the whole lot.
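Because each Bedrock model expects a different request shape, the Lambda has to build a per-model payload before calling `invoke_model`. Here is a minimal sketch of that dispatch; the model IDs and body shapes follow the Bedrock runtime API for the three models listed above, while the helper name, prompt wording, and token limit are illustrative assumptions, not the project's actual code.

```python
import json

# The Bedrock model IDs for the three options the demo offers.
MODELS = {
    "titan": "amazon.titan-text-lite-v1",
    "cohere": "cohere.command-light-text-v14",
    "haiku": "anthropic.claude-3-haiku-20240307-v1:0",
}

def build_request(model: str, text: str) -> dict:
    """Return the modelId and JSON body for a bedrock-runtime invoke_model call."""
    prompt = (
        "Summarise the following text as exactly five bullet points:\n\n" + text
    )
    if model == "titan":
        # Titan Text uses inputText plus a textGenerationConfig block.
        body = {"inputText": prompt,
                "textGenerationConfig": {"maxTokenCount": 512}}
    elif model == "cohere":
        # Cohere Command takes a flat prompt/max_tokens body.
        body = {"prompt": prompt, "max_tokens": 512}
    else:
        # Claude 3 Haiku uses the Messages API shape.
        body = {
            "anthropic_version": "bedrock-2023-05-31",
            "max_tokens": 512,
            "messages": [{"role": "user", "content": prompt}],
        }
    return {"modelId": MODELS[model], "body": json.dumps(body)}
```

The returned dict can be splatted straight into `boto3.client("bedrock-runtime").invoke_model(**build_request(model, text))`, and the function writes the decoded output to `summary/` in S3.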
What it’s good for
- Quick content previews before you dive into a long article.
- Batch-summarising many files by pushing them to the `raw/` folder.
- Plug-and-play demo of how to wire Bedrock into a serverless workflow.
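For single-file previews, a client can call the API Gateway endpoint directly instead of going through the browser. A stdlib-only sketch, assuming the endpoint URL is the one printed by `sam deploy` and that the API accepts a JSON body of the form `{"text": ..., "model": ...}` (both the URL below and that payload shape are placeholders, not confirmed by the project):

```python
import json
import urllib.request

# Placeholder: substitute the HTTP API URL that `sam deploy` prints.
ENDPOINT = "https://example.execute-api.eu-west-1.amazonaws.com/summarise"

def make_payload(text: str, model: str = "titan") -> bytes:
    """JSON body the endpoint is assumed to accept."""
    return json.dumps({"text": text, "model": model}).encode()

def summarise(text: str, model: str = "titan") -> str:
    """POST the text and return the five-bullet summary string."""
    req = urllib.request.Request(
        ENDPOINT,
        data=make_payload(text, model),
        headers={"Content-Type": "application/json"},
        method="POST",
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["summary"]
```

Swapping `model` between `"titan"`, `"cohere"`, and `"haiku"` exercises the on-the-fly model choice without redeploying anything.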
Current limitations
- Text only (<200 KB per file). No PDFs or HTML yet.
- One output language: the input language is auto-detected, but the bullets always come back in English.
- Your Bedrock account is billed for tokens; Haiku costs more than Titan/Cohere.
Future potential: swap in larger models, extend to PDFs via Textract,
or trigger downstream workflows (Slack alerts, vector search, etc.).
Source code on GitHub.