February 19, 2026 · 8 min read
How to Build a Link Preview Generator with a Screenshot API
Link previews are everywhere: Slack, Discord, Twitter, Notion. When someone pastes a URL, users expect to see a thumbnail, title, and description. Here's how to build your own link preview service using a screenshot API and meta tag extraction.
What Makes a Good Link Preview?
A complete link preview has three parts:
- Thumbnail - A visual screenshot of the page (or its OG image)
- Title - From the og:title meta tag or the &lt;title&gt; element
- Description - From og:description or meta[name=description]
The problem: many websites don't set OG images, or their OG images are generic logos. A screenshot API fills that gap by capturing what the page actually looks like.
Architecture Overview
The flow is straightforward:
- User pastes a URL into your app
- Your backend extracts meta tags (title, description, OG image)
- If no OG image exists, call a website screenshot API to generate a thumbnail
- Cache the result and return the preview
You need at most two API calls: one for meta extraction, one for the screenshot. With MetaPeek and GrabShot, each is a single HTTP request.
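The four steps above collapse into a short function. Here's a condensed sketch assuming the MetaPeek and GrabShot endpoints shown in this post; `buildPreview` is an illustrative name, not part of either API:

```javascript
// Condensed flow: extract meta tags, fall back to a screenshot URL
// when the page has no OG image. Caching and error handling omitted.
async function buildPreview(url) {
  // Step 2: extract title, description, and any existing OG image
  const metaRes = await fetch(
    `https://metapeek.grabshot.dev/api/extract?url=${encodeURIComponent(url)}`,
    { headers: { 'X-API-Key': process.env.METAPEEK_KEY } }
  );
  const meta = await metaRes.json();

  // Step 3: no OG image -> point the thumbnail at a screenshot URL
  const thumbnail = meta.og_image ?? `https://grabshot.dev/api/screenshot?${
    new URLSearchParams({ url, width: '1200', height: '630', format: 'webp' })}`;

  return { url, title: meta.title || url, thumbnail };
}
```

The full Express service in Step 3 fleshes this out with caching and error handling.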
Step 1: Extract Meta Tags
First, pull the title, description, and any existing OG image from the target URL.
curl
curl "https://metapeek.grabshot.dev/api/extract?url=https://github.com" \
-H "X-API-Key: YOUR_API_KEY"
Response:
{
  "title": "GitHub: Let's build from here",
  "description": "GitHub is where over 100 million developers shape the future of software.",
  "og_image": "https://github.githubassets.com/assets/social-...",
  "favicon": "https://github.githubassets.com/favicons/favicon.svg"
}
Node.js
async function extractMeta(url) {
  const res = await fetch(
    `https://metapeek.grabshot.dev/api/extract?url=${encodeURIComponent(url)}`,
    { headers: { 'X-API-Key': process.env.METAPEEK_KEY } }
  );
  return res.json();
}
const meta = await extractMeta('https://github.com');
console.log(meta.title); // "GitHub: Let's build from here"
console.log(meta.og_image); // may be null for some sites
Python
import requests
import os
def extract_meta(url):
    resp = requests.get(
        "https://metapeek.grabshot.dev/api/extract",
        params={"url": url},
        headers={"X-API-Key": os.environ["METAPEEK_KEY"]}
    )
    return resp.json()
meta = extract_meta("https://github.com")
print(meta["title"]) # "GitHub: Let's build from here"
print(meta.get("og_image")) # None if not set
Step 2: Capture a Screenshot (Fallback Thumbnail)
If the page has an OG image, you can use it directly. But when it's missing (or you want a consistent look), capture a website screenshot as the thumbnail.
curl
# Get a 1200x630 screenshot (perfect for link previews)
curl "https://grabshot.dev/api/screenshot?url=https://github.com&width=1200&height=630&format=webp" \
-H "X-API-Key: YOUR_API_KEY" \
--output preview.webp
Node.js
async function capturePreview(url) {
  const params = new URLSearchParams({
    url,
    width: '1200',
    height: '630',
    format: 'webp'
  });
  const res = await fetch(
    `https://grabshot.dev/api/screenshot?${params}`,
    { headers: { 'X-API-Key': process.env.GRABSHOT_KEY } }
  );
  return Buffer.from(await res.arrayBuffer());
}
const screenshot = await capturePreview('https://github.com');
// Save to disk, S3, or serve directly
Python
import requests
import os
def capture_preview(url):
    resp = requests.get(
        "https://grabshot.dev/api/screenshot",
        params={
            "url": url,
            "width": 1200,
            "height": 630,
            "format": "webp"
        },
        headers={"X-API-Key": os.environ["GRABSHOT_KEY"]}
    )
    return resp.content
img = capture_preview("https://github.com")
with open("preview.webp", "wb") as f:
    f.write(img)
Step 3: Combine Into a Preview Service
Here's a complete Express.js service that ties it all together. It extracts meta tags, generates a screenshot if needed, and caches results in memory.
import express from 'express';

const app = express();
const cache = new Map();
const CACHE_TTL = 3600_000; // 1 hour

app.get('/preview', async (req, res) => {
  const { url } = req.query;
  if (!url) return res.status(400).json({ error: 'url parameter required' });

  // Check cache
  const cached = cache.get(url);
  if (cached && Date.now() - cached.time < CACHE_TTL) {
    return res.json(cached.data);
  }

  try {
    // Step 1: Extract meta tags
    const metaRes = await fetch(
      `https://metapeek.grabshot.dev/api/extract?url=${encodeURIComponent(url)}`,
      { headers: { 'X-API-Key': process.env.METAPEEK_KEY } }
    );
    const meta = await metaRes.json();

    // Step 2: Use OG image or generate screenshot
    let thumbnail = meta.og_image;
    if (!thumbnail) {
      const params = new URLSearchParams({
        url, width: '1200', height: '630', format: 'webp'
      });
      thumbnail = `https://grabshot.dev/api/screenshot?${params}`;
      // In production, you'd download + upload to your CDN
    }

    const preview = {
      url,
      title: meta.title || url,
      description: meta.description || '',
      thumbnail,
      favicon: meta.favicon || null,
      source: meta.og_image ? 'og_image' : 'screenshot'
    };

    cache.set(url, { data: preview, time: Date.now() });
    res.json(preview);
  } catch (err) {
    res.status(500).json({ error: 'Failed to generate preview' });
  }
});

app.listen(3000, () => console.log('Preview service on :3000'));
Optimizing for Production
A toy link preview service is easy. A production one needs a few more things:
Cache aggressively
Screenshots don't change often. Cache previews for at least an hour, ideally longer. Use Redis or a CDN edge cache instead of an in-memory Map. This keeps your API costs low and response times fast.
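One way to keep the cache store swappable is a small wrapper that only assumes a Map-like get/set interface. A sketch (the names are illustrative; a Redis client would need a thin adapter plus JSON serialization for the stored values):

```javascript
// TTL cache wrapper around any async function keyed by its argument.
// `store` only needs get/set: a Map works locally, and a Redis-backed
// adapter can be dropped in for production without touching callers.
function withCache(store, ttlMs, fn) {
  return async (key) => {
    const hit = await store.get(key); // Map returns sync; an adapter may return a promise
    if (hit && Date.now() - hit.time < ttlMs) return hit.data; // fresh hit
    const data = await fn(key); // miss or stale: recompute
    await store.set(key, { data, time: Date.now() });
    return data;
  };
}

// Usage sketch:
// const getPreview = withCache(new Map(), 3600_000, async (url) => { /* build preview */ });
```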
Use the right dimensions
The standard OG image size is 1200x630 pixels (roughly 1.91:1 ratio). This works well on Twitter, Facebook, Slack, and Discord. If you're building for a specific platform, check their recommended sizes:
- Twitter/X: 1200x628
- Facebook: 1200x630
- LinkedIn: 1200x627
- Slack: 800x418 (smaller is fine)
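Those sizes can live in one lookup table so the screenshot call stays platform-aware. A sketch; `PREVIEW_SIZES` and `screenshotParams` are illustrative names, not part of the GrabShot API:

```javascript
// Recommended preview dimensions per platform (from the list above)
const PREVIEW_SIZES = {
  default:  { width: 1200, height: 630 },  // standard OG ratio, ~1.91:1
  twitter:  { width: 1200, height: 628 },
  facebook: { width: 1200, height: 630 },
  linkedin: { width: 1200, height: 627 },
  slack:    { width: 800,  height: 418 },
};

// Build the screenshot query string for a given platform
function screenshotParams(url, platform = 'default') {
  const { width, height } = PREVIEW_SIZES[platform] ?? PREVIEW_SIZES.default;
  return new URLSearchParams({
    url,
    width: String(width),
    height: String(height),
    format: 'webp',
  }).toString();
}
```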
Handle edge cases
Not every URL plays nice. Some things to watch for:
- SPAs - Client-rendered apps need JavaScript execution. A screenshot API handles this (it runs a real browser), but simple HTML scraping won't.
- Paywalls / login walls - Return the screenshot of whatever is visible. Don't try to bypass auth.
- Timeouts - Set a reasonable timeout (10-15 seconds). If the page is slow, return a placeholder.
- Invalid URLs - Validate before making API calls. Check for proper protocol and reachable domains.
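For the last point, a cheap pre-flight check catches most junk input before it costs an API call. A sketch; `isValidPreviewUrl` is an illustrative name, and a DNS lookup could be layered on to confirm the domain is actually reachable:

```javascript
// Reject anything that isn't an http(s) URL with a plausible hostname
function isValidPreviewUrl(input) {
  let parsed;
  try {
    parsed = new URL(input); // throws on anything that isn't URL-shaped
  } catch {
    return false;
  }
  // Only http(s) pages can be screenshotted
  if (parsed.protocol !== 'http:' && parsed.protocol !== 'https:') return false;
  // Require a dot in the hostname to filter bare words and internal hosts
  return parsed.hostname.includes('.');
}
```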
Serve WebP for speed
WebP thumbnails are 25-35% smaller than PNG at similar quality. GrabShot supports format=webp natively, so you get smaller payloads without any post-processing.
Use Cases Beyond Chat Apps
Link previews aren't just for messaging. Here's where the same technique works:
- Bookmarking tools - Show thumbnails for saved links (like Raindrop or Pocket)
- CMS and blogs - Auto-generate preview cards when editors paste URLs
- Social media schedulers - Preview how a link will look before posting
- Email builders - Embed rich link previews in marketing emails
- Internal wikis - Make external references more scannable
Why Use an API Instead of Puppeteer?
You could run Puppeteer yourself, but there are real tradeoffs:
| | Self-hosted Puppeteer | Screenshot API |
|---|---|---|
| Setup time | Hours (Chrome, deps, sandbox) | Minutes (one API key) |
| Memory usage | 200-500 MB per browser | ~0 (offloaded) |
| Scaling | You manage concurrency | Handled for you |
| Font rendering | Install fonts manually | Pre-configured |
| Cost at scale | Server costs + maintenance | Per-screenshot pricing |
For prototypes and low-volume use, Puppeteer is fine. For anything serving real users, an API like GrabShot saves you from becoming a browser infrastructure engineer.
Try It Yourself
GrabShot's free tier gives you 25 screenshots per month, which is enough to prototype a link preview service. No credit card required.