How We Built Link Previews Without Breaking the Internet
Imagine pasting a YouTube link into a message and instantly seeing a rich preview with the video’s title, thumbnail, and description—no extra work required. That’s the magic of link previews, and it’s exactly what we wanted to bring to our platform when influencers share product links.
Instead of asking users to manually upload images or write descriptions for every URL they add, we use Open Graph (OG) metadata to automatically generate beautiful, informative previews—just like WhatsApp, X (formerly Twitter), or Slack do when you paste a link.
In this post, I’ll walk you through how we built this feature the right way—by doing the heavy lifting on the backend—and why that decision was critical for performance, security, and reliability.
How Link Previews Work: The Big Picture
At its core, the link preview system follows a simple flow:
- An influencer adds a link (e.g., to a YouTube video or an Amazon product).
- The frontend asks the backend to fetch that URL.
- The backend visits the page, reads its Open Graph tags, and extracts metadata like og:title, og:image, and og:description.
- The backend returns a clean preview object.
- The frontend displays a rich card with the title, image, and site name.
Here’s the high-level flow:
Client → Backend → Target URL → Parse OG tags → Return preview → Client renders
This automation removes friction for creators while giving audiences more context—turning plain links into engaging, visual experiences.
Why the Backend Does the Heavy Lifting
You might wonder: Why not just have the client (browser) fetch the URL directly? After all, it’s simpler.
But here’s the catch: browsers block cross-origin requests. If your app tries to fetch https://youtube.com/watch?v=abc123 from the frontend, YouTube’s server will reject the request unless it explicitly allows your domain via CORS headers—and most big sites don’t.
So we do it server-side. And as it turns out, this architecture choice unlocks several major benefits.
🔒 Security: Preventing Malicious Requests
Letting the client fetch arbitrary URLs opens the door to Server-Side Request Forgery (SSRF) attacks. A malicious actor could trick the system into accessing internal services (like http://localhost:8080/admin) by disguising them as external URLs.
By handling the fetch on the backend, we can:
- Validate URLs (e.g., ensure they use http:// or https://)
- Block private IP ranges and internal domains
- Sanitize input before making any external calls
This layer of control keeps our infrastructure—and our users—safe.
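To make that concrete, here's a minimal sketch of the kind of validation layer we're describing, written in TypeScript for a Node backend. The helper names and blocklist are illustrative, not our exact production code:
import { lookup } from "node:dns/promises";
import { isIP } from "node:net";

const BLOCKED_HOSTNAMES = new Set(["localhost"]);

function isPrivateIPv4(ip: string): boolean {
  // 10.0.0.0/8, 127.0.0.0/8, 169.254.0.0/16, 172.16.0.0/12, 192.168.0.0/16
  const [a, b] = ip.split(".").map(Number);
  return (
    a === 10 ||
    a === 127 ||
    (a === 169 && b === 254) ||
    (a === 172 && b >= 16 && b <= 31) ||
    (a === 192 && b === 168)
  );
}

export async function assertSafeUrl(rawUrl: string): Promise<URL> {
  const url = new URL(rawUrl); // throws on malformed input

  // Only allow plain web protocols
  if (url.protocol !== "http:" && url.protocol !== "https:") {
    throw new Error("Unsupported protocol");
  }
  if (BLOCKED_HOSTNAMES.has(url.hostname)) {
    throw new Error("Blocked hostname");
  }

  // Resolve the hostname and reject private or loopback addresses
  const { address } = await lookup(url.hostname);
  if (isIP(address) === 4 && isPrivateIPv4(address)) {
    throw new Error("Private IP range blocked");
  }
  return url;
}
A real implementation would also need to cover IPv6 ranges and DNS rebinding, but the principle is the same: validate before you fetch.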
💾 Caching: One Fetch, Millions of Views
Imagine 10,000 users viewing the same YouTube link. If each browser fetches the preview independently, that’s 10,000 identical requests to YouTube. That’s inefficient—and could get us rate-limited or blocked.
When the backend handles the fetch, we can cache the result and serve it to everyone. We use a 24-hour TTL (time to live) because OG data rarely changes. One request benefits thousands.
Compare that to client-side caching, which is isolated to individual devices—no sharing, no efficiency.
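Under the hood this can be a simple cache-aside pattern keyed on the URL. The sketch below assumes a Redis client (ioredis, purely as an example) and the 24-hour TTL mentioned above:
import Redis from "ioredis";

const redis = new Redis();
const TTL_SECONDS = 60 * 60 * 24; // 24 hours: OG data rarely changes

// Cache-aside wrapper (illustrative): the first request for a URL pays the
// cost of fetching it; every later request gets the cached preview.
export async function getCachedPreview(
  url: string,
  fetchPreview: (url: string) => Promise<unknown>
): Promise<unknown> {
  const key = `link-preview:${url}`;

  const cached = await redis.get(key);
  if (cached) {
    return JSON.parse(cached);
  }

  const preview = await fetchPreview(url);
  await redis.set(key, JSON.stringify(preview), "EX", TTL_SECONDS);
  return preview;
}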
📈 Reliability: Keeping the UI Fast and Responsive
Client networks are unpredictable—mobile connections drop, Wi-Fi stutters. If the frontend waited for a slow site to respond, the UI might freeze.
With backend fetching:
- We set a 5-second timeout on external requests
- Failed or slow requests return gracefully (e.g., show just the domain)
- The rest of the page loads unaffected
This ensures a smooth experience even when third-party sites are misbehaving.
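Enforcing that timeout is straightforward with an AbortController. Here's a rough sketch; the function name and fallback behavior are illustrative:
// Abort the request after 5 seconds; callers treat null as "no preview"
export async function fetchWithTimeout(
  url: string,
  timeoutMs = 5000
): Promise<string | null> {
  const controller = new AbortController();
  const timer = setTimeout(() => controller.abort(), timeoutMs);

  try {
    const response = await fetch(url, { signal: controller.signal });
    if (!response.ok) return null; // treat 4xx/5xx as "no preview"
    return await response.text();
  } catch {
    return null; // timeout or network error: show just the domain instead
  } finally {
    clearTimeout(timer);
  }
}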
✅ Consistency: Everyone Sees the Same Preview
We want all users to see consistent previews for the same URL—no variation based on location, network conditions, or device.
Backend fetching guarantees that. Whether you're in Tokyo or Toronto, the preview comes from a single, canonical source.
Comparison at a Glance
| Concern | Client Fetch | Backend Fetch |
|---|---|---|
| CORS Restrictions | ❌ Blocked by browser | ✅ No issues |
| Security | ❌ Risk of SSRF, no control | ✅ Validated, safe |
| Caching Efficiency | ❌ Per-user, repetitive | ✅ Shared globally |
| Network Reliability | ❌ Depends on user connection | ✅ Stable server |
| Consistency | ❌ May vary by user | ✅ Uniform across users |
Spoiler: Backend wins by a mile.
Technical Implementation
Now that we’ve covered the why, let’s dive into the how.
GraphQL API Endpoints
We expose two key queries:
1. Get Item Details
query Item($id: String!) {
item(id: $id) {
additionalLinks {
url
type
}
}
}
This returns the list of links attached to an item.
2. Fetch Link Preview
query LinkPreview($url: String!) {
linkPreview(url: $url) {
url
title
description
image
siteName
favicon
}
}
Called once per link, this triggers the backend to fetch and parse OG metadata.
Click here to see how I implemented the backend.
Frontend Integration
On the frontend, we use a React Query hook to manage loading states and caching:
function LinkPreviewCard({ link }) {
const { data: preview, isLoading } = useLinkPreview(link.url);
if (isLoading) {
return <SkeletonCard />;
}
  // Fall back to just the domain if the fetch failed or returned no OG data
  if (!preview || !preview.title) {
    return <Card>{new URL(link.url).hostname}</Card>;
  }
  return (
    <Card>
      {preview.image && <img src={preview.image} alt={preview.title} />}
      <h3>{preview.title}</h3>
      <p>{preview.description}</p>
      <span>{preview.siteName}</span>
    </Card>
  );
}
While loading, we show a skeleton loader for a polished feel—but we always display the domain immediately since it’s already part of the link data.
We also cache previews on the client for 1 hour using React Query, reducing redundant network calls during a single session.
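For reference, a minimal version of that hook might look like the following. The /graphql endpoint and helper names are assumptions for the sake of illustration; the frontend library is @tanstack/react-query:
import { useQuery } from "@tanstack/react-query";

// Illustrative sketch of the hook used above; not our exact production code.
const LINK_PREVIEW_QUERY = `
  query LinkPreview($url: String!) {
    linkPreview(url: $url) {
      url
      title
      description
      image
      siteName
      favicon
    }
  }
`;

async function fetchLinkPreview(url: string) {
  const response = await fetch("/graphql", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify({ query: LINK_PREVIEW_QUERY, variables: { url } }),
  });
  const { data } = await response.json();
  return data.linkPreview;
}

export function useLinkPreview(url: string) {
  return useQuery({
    queryKey: ["linkPreview", url],
    queryFn: () => fetchLinkPreview(url),
    staleTime: 1000 * 60 * 60, // reuse the preview for 1 hour per session
  });
}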
Backend Responsibilities
The backend handles the core logic:
- Fetch the target URL with a 5-second timeout
- Parse HTML for Open Graph tags, e.g.:
<meta property="og:title" content="How to Bake Bread" />
<meta property="og:image" content="https://example.com/bread.jpg" />
<meta property="og:description" content="A beginner's guide..." />
- Return structured metadata or fallback values
- Cache results for 24 hours using Redis or similar
If the site blocks scraping or returns no OG tags, we fall back to showing just the domain (e.g., youtube.com).
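Putting those responsibilities together, the parsing step can look roughly like this, assuming an HTML parser such as cheerio (the library choice and function names are illustrative):
import * as cheerio from "cheerio";

// Illustrative parser: pull OG tags out of the fetched HTML and fall back
// to the bare domain when a site exposes nothing useful.
export function parseOpenGraph(url: string, html: string | null) {
  const hostname = new URL(url).hostname;

  if (!html) {
    return { url, siteName: hostname }; // scrape blocked or request timed out
  }

  const $ = cheerio.load(html);
  const og = (property: string) =>
    $(`meta[property="og:${property}"]`).attr("content") ?? null;

  return {
    url,
    title: og("title") ?? ($("title").text() || null),
    description: og("description"),
    image: og("image"),
    siteName: og("site_name") ?? hostname,
    favicon: $('link[rel="icon"]').attr("href") ?? null,
  };
}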
Data Types
Here’s what the data looks like on both sides:
interface LinkPreview {
url: string;
title?: string | null;
description?: string | null;
image?: string | null;
siteName?: string | null;
favicon?: string | null;
}
interface AdditionalLink {
url: string;
type: 'PromotionalUrl' | 'RedirectUrl';
}
Full Flow Overview
Here’s the complete lifecycle of a link preview:
┌──────────────┐ ┌──────────────┐ ┌──────────────┐
│ Client │────▶│ Backend │────▶│ Target URL │
│ │ │ │ │ │
│ linkPreview │ │ Fetch + Parse│ │ <meta> │
│ (url) │◀────│ OG tags │◀────│ og:* │
└──────────────┘ └──────────────┘ └──────────────┘
│ │
▼ ▼
Show skeleton Cache result
then render            (24h TTL)
Each step is optimized for speed, safety, and scalability.
Key Challenges & Trade-offs
Like all features, link previews come with trade-offs:
- Extra backend request per link: Solved with aggressive caching.
- Slow external sites: Mitigated with timeouts and fallback UI.
- Some sites block scrapers: Handled gracefully—show domain only.
- OG tags missing or incomplete: We display what’s available, defaulting to basic info.
These edge cases are rare, but planning for them ensures a resilient user experience.
Final Thoughts: What We Learned
Building link previews taught us a few key lessons:
- Never underestimate CORS. What seems like a simple fetch request can be a hard blocker in the browser.
- Shared caching matters. One backend response can serve millions—don’t waste that.
- Clients aren’t trustworthy fetchers. They’re constrained, insecure, and inefficient for this job.
- Fallbacks are essential. The web is messy; design for failure.
By moving URL fetching to the backend, we created a secure, fast, and consistent preview system that scales beautifully.
And best of all? Influencers don’t have to lift a finger—just paste a link, and the preview appears like magic.
That’s the kind of UX worth building.