In a modern Meta setup, a single conversion (for example, Purchase) is usually sent twice: once from the browser via the Meta Pixel, and once from your server via the Conversions API (CAPI).
Meta is supposed to merge those two into one real conversion using an exact match on event_name and event_id. When they don’t match, you get double-counting, artificially deflated CPAs, broken learning phases, and noisy attribution.
The most common failures are: event IDs generated independently in the browser and on the server, GTM variables that re-evaluate per tag, mismatched event-name casing, the eventID placed inside custom data instead of the fourth fbq() argument, and multiple plugins all firing the same event.
If you generate a single ID in the browser, reuse it on the server, keep names consistent, and kill duplicate emitters, your Meta numbers snap back to reality and the algorithm finally sees clean conversion signals.
Meta has shifted from “nice-to-have tracking” to “your data is the product.” When you run a hybrid setup (Pixel + CAPI), you’re intentionally sending duplicate events so Meta can recover conversions blocked in the browser and enrich events with first-party server data.
If deduplication fails, Meta doesn’t quietly shrug and move on. Duplicate events inflate conversion counts, deflate reported CPAs, feed corrupted signals into the learning phase, and distort attribution.
There’s also an AI visibility angle: your case studies, dashboards, and attribution proofs are increasingly parsed by search engines and AI systems. If your internal numbers are polluted by duplicates, your external narrative—and the models summarizing you—becomes unreliable.
That’s why high-spend accounts treat deduplication as part of broader Tracking and Analytics Integrity, not a one-off pixel tweak.
In a healthy redundant setup, a single user action triggers two events:
Browser Pixel event: fired in the user’s browser via fbq(). Carries cookies like _fbp and _fbc, plus device and on-page context.
fbq('track', 'Purchase', {
value: 99.00,
currency: 'USD'
}, {
eventID: 'evt_123'
});

Server CAPI event: fired from your backend or server-side GTM. Recovers conversions blocked in the browser and enriches events with first-party data.
{
"event_name": "Purchase",
"event_id": "evt_123",
"action_source": "website",
"user_data": {
"em": "hashed_email_value"
},
"custom_data": {
"value": 99.00,
"currency": "USD"
}
}

Meta’s deduplication is deterministic, not fuzzy. To merge two events into one, it needs an exact match on both event_name and event_id.
Other fields like fbp, fbc, or external_id help with identity and matching, but they don’t guarantee deduplication the way event_name + event_id does.
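As a quick illustration, here is a minimal browser-side sketch for reading those cookies so your backend can forward them to CAPI as user_data.fbp and user_data.fbc (getCookie is a hypothetical helper, not a Meta API):

// Read Meta's first-party cookies. They improve match quality, but
// deduplication still hinges on event_name + event_id alone.
function getCookie(name) {
  var match = document.cookie.match(new RegExp('(?:^|; )' + name + '=([^;]*)'));
  return match ? decodeURIComponent(match[1]) : null;
}

var fbp = getCookie('_fbp'); // e.g. 'fb.1.1700000000000.123456789'
var fbc = getCookie('_fbc'); // only present after a click carrying fbclid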
Meta buffers events for about 48 hours to reconcile browser and server streams; a server event that arrives more than roughly 48 hours after its browser twin is treated as a brand-new conversion.
If you’re queuing CAPI events and flushing them days later, you’re effectively building double-counting into your architecture.
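If you must queue server events, a guard along these lines (a sketch; flushQueue and sendToMeta are hypothetical names) keeps stale events from re-entering Meta as fresh conversions:

// Refuse to flush CAPI events that have fallen outside Meta's ~48h dedup window.
var DEDUP_WINDOW_MS = 48 * 60 * 60 * 1000;

function flushQueue(queue, sendToMeta) {
  var now = Date.now();
  queue.forEach(function (event) {
    // event.event_time is the Unix timestamp (in seconds) of the original action.
    if (now - event.event_time * 1000 > DEDUP_WINDOW_MS) {
      console.warn('Dropping stale event that would double-count:', event.event_id);
      return;
    }
    sendToMeta(event);
  });
}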
The browser uses Math.random() and the server uses uuid(). Meta sees two unique conversions: Purchase / 89234 and Purchase / 77321. Deduplication rate: 0%.
The built-in Random Number variable in GTM is evaluated each time it’s used. Your Pixel tag gets one number, your GA4 tag (feeding server-side GTM) gets another. Browser and server IDs never match, so nothing dedupes even though you “used the same variable”.
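One way out, sketched below under the assumption that at most one conversion fires per page load: replace the built-in Random Number variable with a GTM Custom JavaScript Variable that caches a single ID on the page, so every tag that reads it gets the same value (window._metaEventId is an arbitrary property name):

// GTM Custom JavaScript Variable: generate the event ID once per page load,
// then return the cached value on every subsequent read.
function() {
  if (!window._metaEventId) {
    window._metaEventId = 'evt_' + Date.now() + '_' +
      Math.random().toString(36).slice(2, 11);
  }
  return window._metaEventId;
}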
Pixel sends 'Purchase', server sends "purchase" or "purchase_event". Meta considers them different events and won’t dedupe across them.
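If your browser and server tracking share a JavaScript codebase, a small safeguard (a sketch, not a Meta requirement) is a single constants module both sides import, so casing can never drift:

// events.js — single source of truth for Meta event names,
// imported by both the browser tracking code and the Node CAPI sender.
export const META_EVENTS = {
  PURCHASE: 'Purchase',     // exact standard-event casing Meta expects
  LEAD: 'Lead',
  ADD_TO_CART: 'AddToCart'
};

Both fbq('track', META_EVENTS.PURCHASE, …) and the server payload then reference the same literal.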
Putting the ID inside the custom data object means Meta treats it as just another parameter, not the dedupe key.
Wrong
fbq('track', 'Purchase', {
value: 10,
currency: 'USD',
eventID: 'evt_123' // ❌ wrong place
});

Correct
fbq('track', 'Purchase', {
value: 10,
currency: 'USD'
}, {
eventID: 'evt_123' // ✅ fourth argument
});

It’s common to see Facebook for WooCommerce + PixelYourSite + GTM4WP + hardcoded Pixel all active at once. That can fire multiple browser purchases plus a CAPI purchase, each with different IDs. Even if one pair dedupes correctly, the others inflate conversions and wreck attribution.
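The fix is disabling the redundant emitters, but to see the overlap first, a rough diagnostic sketch like this can help (paste it into the DevTools console after the page loads; wrapping the global fbq is a debugging trick, not an official Meta tool):

// Wrap fbq to count track calls and flag likely duplicate emitters.
(function () {
  var realFbq = window.fbq;
  if (!realFbq) { console.warn('fbq not found on this page'); return; }
  var counts = {};
  function wrapped() {
    if (arguments[0] === 'track') {
      var name = arguments[1];
      counts[name] = (counts[name] || 0) + 1;
      if (counts[name] > 1) {
        console.warn('Possible duplicate emitter: ' + name + ' fired ' + counts[name] + 'x this page view');
      }
    }
    return realFbq.apply(this, arguments);
  }
  for (var key in realFbq) { wrapped[key] = realFbq[key]; } // keep queue, loaded, etc.
  window.fbq = wrapped;
})();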
Before touching code, you want to know whether the failure is in the browser, the server, or inside Meta’s processing. Use this four-stage workflow.
First, open DevTools → Network on the conversion page, filter for facebook.com/tr, and check the Pixel request for the eid parameter (that is how fbq() transmits eventID). If there is no eid, you have a pure frontend issue; the server can’t fix what the browser never sends.
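If you’d rather script that check than eyeball the Network tab, a console snippet along these lines (a sketch built on the Resource Timing API, not an official tool) lists recent Pixel requests and their eid values:

// Paste into the DevTools console: list Meta Pixel requests and their eid parameter.
performance.getEntriesByType('resource')
  .map(function (entry) { return entry.name; })
  .filter(function (url) { return url.indexOf('facebook.com/tr') !== -1; })
  .forEach(function (url) {
    var eid = new URL(url).searchParams.get('eid');
    console.log(eid ? 'eid=' + eid : 'NO eid -> frontend issue', url.slice(0, 100));
  });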
Next, confirm that the server is sending the same ID and event name to Meta: log the outgoing CAPI payload (or use server-side GTM’s preview mode) and compare its event_id and event_name against the browser’s eid.
Use the Test Events tab in Events Manager: fire a test conversion and confirm that a browser event and a server event arrive with the same ID, and that Meta flags the pair as deduplicated.
In Events Manager, drill into a specific event and look at the Event Deduplication section: it shows how many server events were removed as duplicates of browser events, which tells you whether dedup is holding up at production volume.
If the diagnostics don’t make sense relative to your stack, that’s a good time to zoom out and audit the full funnel with a GA4 and Google Ads attribution alignment lens, not just Meta in isolation.
The safest pattern is: generate a unique ID in the browser, use it in the Pixel, and send it along to your backend. The server should never try to “guess” the same random value.
// 1. Generate ID right before tracking
var uniqueEventId = 'evt_' + Date.now() + '_' + Math.random().toString(36).slice(2, 11);
// 2. Fire browser Pixel
fbq('track', 'Purchase', {
value: 99.00,
currency: 'USD'
}, {
eventID: uniqueEventId
});
// 3. Send same ID to your backend
fetch('/api/track-conversion', {
method: 'POST',
headers: { 'Content-Type': 'application/json' },
body: JSON.stringify({
event_name: 'Purchase',
event_id: uniqueEventId,
user_data: {
email: 'user@example.com'
},
custom_data: {
value: 99.00,
currency: 'USD'
}
})
});

From there, your backend can use the official SDKs (PHP, Python, Node) or direct HTTP calls to send the CAPI event to Meta.
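For instance, here is a minimal Node sketch of that backend endpoint using a direct HTTP call (the Express-style handler, environment variable names, and API version are assumptions; the endpoint path and payload shape follow Meta’s Conversions API):

// Receive the browser's payload and relay it to Meta's Conversions API,
// reusing the browser-generated event_id unchanged. Requires Node 18+ (global fetch).
const crypto = require('crypto');

const PIXEL_ID = process.env.META_PIXEL_ID;       // assumption: set in your environment
const ACCESS_TOKEN = process.env.META_ACCESS_TOKEN;

async function handleTrackConversion(req, res) {
  const { event_name, event_id, user_data, custom_data } = req.body;

  const payload = {
    data: [{
      event_name,                                  // must match the Pixel exactly
      event_id,                                    // the dedupe key from the browser
      event_time: Math.floor(Date.now() / 1000),
      action_source: 'website',
      user_data: {
        // Meta requires emails to be normalized and SHA-256 hashed before sending.
        em: [crypto.createHash('sha256')
          .update(user_data.email.trim().toLowerCase())
          .digest('hex')]
      },
      custom_data
    }]
  };

  const response = await fetch(
    'https://graph.facebook.com/v19.0/' + PIXEL_ID + '/events?access_token=' + ACCESS_TOKEN,
    {
      method: 'POST',
      headers: { 'Content-Type': 'application/json' },
      body: JSON.stringify(payload)
    }
  );

  res.status(response.ok ? 200 : 502).end();
}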
For teams invested in server-side tracking, the most reliable architecture is: GTM Web + GTM Server with a shared event ID.
The choice of where you host GTM Server (e.g. Stape vs Google Cloud) affects cost, latency, and control. For that decision, see a dedicated server-side GTM comparison, such as a detailed Stape vs Google Cloud for GTM breakdown.
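In that architecture, the shared ID typically rides the dataLayer so the web container’s Meta Pixel tag and the GA4 tag feeding the server container both read one value; a minimal sketch (the event_id key is a common convention, not a GTM requirement):

// Push one shared ID into the dataLayer. The Pixel tag reads it as eventID,
// the GA4 tag forwards it as an event parameter, and the server container's
// Meta tag reuses it as event_id for CAPI.
window.dataLayer = window.dataLayer || [];
window.dataLayer.push({
  event: 'purchase',
  event_id: 'evt_' + Date.now() + '_' + Math.random().toString(36).slice(2, 11),
  value: 99.00,
  currency: 'USD'
});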
In WooCommerce, the order ID is a natural, stable dedupe key for purchases. Use it for both Pixel and CAPI:
<?php
add_action( 'woocommerce_thankyou', 'meta_purchase_pixel_and_capi' );
function meta_purchase_pixel_and_capi( $order_id ) {
$order = wc_get_order( $order_id );
$event_id = 'order_' . $order_id;
?>
<script>
fbq('track', 'Purchase', {
value: <?php echo (float) $order->get_total(); ?>,
currency: '<?php echo esc_js( $order->get_currency() ); ?>'
}, {
eventID: '<?php echo esc_js( $event_id ); ?>'
});
</script>
<?php
// Your CAPI handler should reuse the same $event_id:
// send_meta_capi_purchase_event( $order, $event_id );
}

This pattern also plays nicely with Google Ads and GA4 when you’re cleaning up wider conversion tracking issues, as covered in Google Ads conversion tracking fixes.
You don’t really “have” deduplication until this list is true in production:

- The browser Pixel and the server CAPI event carry an identical event_id for every conversion.
- event_name matches exactly, including casing, across both channels.
- eventID sits in the fourth argument of fbq(), never inside custom data.
- Exactly one browser emitter and one server emitter fire per event; redundant plugins are disabled.
- Server events reach Meta inside the ~48-hour deduplication window.
- Events Manager’s Event Deduplication diagnostics show a healthy dedup rate on the events you spend against.
Once deduplication is stable, you can confidently build higher-level analytics, cross-channel attribution, and AI-visible proof assets that reference real-world performance—rather than an illusion created by duplicates.