Improve Facebook Ads ROI with the Conversions API

Jeff Sloan, October 15, 2021

Jeff is a Senior Data Community Advocate at Census, previously a Customer Data Architect and a Product Manager. Jeff has strong opinions on LEFT JOINs, data strategy, and the order in which you add onions and garlic to a hot pan. Based in New York City.

In this article, you'll learn how to find the right people at the right time and improve Facebook Ads ROI by giving Facebook's machine learning better training data via the Conversions API. We'll cover:

  • Why you should use the Conversions API to improve your Facebook Ads ROI
  • How to configure your Facebook Ads pixel so you can also use the Conversions API
  • How to model your data for the Facebook Ads Conversions API
  • How to import your conversions into Facebook Ads using a reverse ETL tool

Bad news: If you're only using the Facebook Ads conversion pixel to track the success of your campaigns, you may be spending more than you need on Facebook Ads. Ad blockers and third-party cookie expiration (e.g. via Apple’s ITP) prevent Facebook Ads from learning which types of customers respond best to your ad campaigns.

To help Facebook build the best profile of prospective customers to target, you need to provide it with the most complete training dataset possible. Otherwise, Facebook will waste your budget experimenting on different target audiences until it has sufficient data to begin targeting more effectively.

Good news: The Facebook Ads Conversions API lets you train Facebook faster (and at a lower cost).

The Conversions API, formerly known as the Facebook Server-Side API, lets you supplement Facebook’s incomplete, pixel-based account of conversions with the data you’ve collected in your systems of record: orders created, forms submitted to your CRM, or new sign-ups from your database. This API is so powerful that Shopify built a native integration with Facebook to help its customers optimize their marketing campaigns.

Before we dive into the nitty-gritty, let’s take a look at some usability considerations for this API.

Audience targeting and the Facebook Ads Conversions API

Facebook uses machine learning models to display ads to people who are most likely to engage with your ad campaigns. Before Facebook’s ML targeting, your only option was manual, fine-grained audience targeting. Not only was this time-intensive, it also involved a lot of trial and error.

Thankfully, Facebook’s ML targeting outperforms those elbow-grease methods, learning on our behalf instead … if it’s given a sufficient signal of conversion events. Facebook needs a sufficiently large volume of conversion events to identify the best-fitting audience for your ad campaign.

If you run your eCommerce store on Shopify, you can take advantage of its native integration with the Conversions API to help ensure that Facebook has the best signal to train on. But what about the rest of us mere mortals without Shopify’s largesse? We’re going to need to integrate with the Conversions API some other way.

If you’ve ever worked on a team wrestling Facebook’s APIs in the past, you probably have a knee-jerk negative reaction to this method. I don’t blame you. Historically, Facebook’s APIs have been notoriously complicated, confusing, and prone to change. Teams have even gone so far as employing full-time engineers just to troubleshoot Facebook API changes, upgrades, and errors.

And Facebook doesn’t make keeping track of these APIs any easier: it maintains similar-sounding APIs for similar-sounding purposes (e.g. the Facebook Offline Conversions API). While this article discusses the Conversions API, the Offline Conversions API covers only a subset of its functionality, and it’s preferred only for tracking physical store sales (it can’t supplement the online conversion events you’re already tracking with your conversion pixel). If you’re not careful, you may end up building an integration with the wrong API for your goals.

Fortunately, working with Facebook’s APIs has gotten a whole lot easier.

You can now use reverse ETL (extract, transform, and load) tools like Census to take advantage of the Facebook Conversions API, no engineering favors required. Reverse ETL tools remove the headache of building and maintaining your own pipelines, so you can focus on using the data instead of moving it around. Plus, if you have any questions on the behavior of the specific Facebook API you’re working with, your vendor can lean on their firsthand experience wrestling those APIs to keep you in the loop.

OK, now onto the good stuff. To get the most out of this guide, you’ll need the following:

  1. A Facebook Ads Account running ad campaigns
  2. A cloud data warehouse. We recommend Snowflake
  3. ETL pipelines that load the conversion data from your systems of record into your cloud data warehouse. We recommend Fivetran
  4. A reverse ETL tool like Census

Let’s dive in.

How to configure your Facebook Ads pixel tracking

To supplement the Facebook Ads training dataset accurately, you’ll need to ensure that you’re not duplicating conversion events that have already been sent via the pixel. If you send certain optional parameters via the pixel, Facebook can use them to deduplicate data coming from both the pixel and the Conversions API.

Specifically, you must either explicitly set the pixel event’s event ID so it corresponds to an identifier in another system (e.g. the “order_id” or “lead_id” in your internal systems), or set an external ID value with that type of identifier instead. If you implement the former, Facebook uses the event ID alongside the event name parameter to discard duplicate events. If you implement the latter, conversion events are only deduplicated under certain conditions.

Facebook provides guidance on both methods in their documentation here, but the TL;DR is:

  1. (Recommended) Deduplication based on the pixel’s event ID ensures that events sent via both the pixel and the Conversions API are never double-counted. Only the first version is kept, provided the second event arrives within 48 hours of the first event with the same event ID (see the sketch after this list).
  2. Deduplication based on an external ID value only prevents duplicates when events are sent first from the pixel and then through the Conversions API. To check the external ID values sent via your browser pixel, use the Pixel Helper tool and look under Advanced Matching Parameters Sent.
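
To make that concrete, here’s a minimal sketch (in Python, with hypothetical values) of the pair of events Facebook treats as one conversion: the browser pixel event and the Conversions API event share the same event name and event ID, so whichever arrives second within the 48-hour window is discarded.

order_id = "5678"  # your internal identifier, e.g. orders.order_id

# What the pixel sends from the browser (conceptually).
browser_event = {
    "event_name": "Purchase",
    "event_id": order_id,  # set via the pixel's eventID parameter
}

# What you send via the Conversions API from your warehouse data.
server_event = {
    "event_name": "Purchase",
    "event_id": order_id,  # identical to the pixel's event ID
    "action_source": "website",
    "event_time": 1609459200,  # Unix time, GMT
}

# Facebook keeps the first event it receives and discards the second,
# because event_name and event_id match.
assert (browser_event["event_name"], browser_event["event_id"]) == \
       (server_event["event_name"], server_event["event_id"])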

After you ensure you're tracking the required parameters for Facebook Ads to deduplicate events sent via the Conversions API, you are ready to start identifying the conversions that you wish to import into Facebook Ads.

How to model your data for the Facebook Ads Conversions API

Next, you’ll need to identify the conversions in your data warehouse that you’d like to send over to Facebook Ads. Using this data, Facebook will supplement its view of success and augment its targeting accordingly.

Facebook Ads will use the data you provide for the following three activities:

  1. Reporting. The conversion’s parameters ensure it lands in the right reports and is counted for the right conversion event at the right time.
  2. Ads attribution. With the user details you provide, Facebook can attempt to match the conversion to a Facebook user’s ad engagement.
  3. Deduplication. If you’re also sending this conversion data via the pixel, Facebook can deduplicate the conversions, provided the data sent via the API arrives within two days of the pixel tracking the conversion event.

You’ll need the following data in the results of your query: your pixel ID, an event ID, the action source, the event time, the event name, and the customer details (email, first name, last name) Facebook can use for matching.

Note: Your Pixel ID can be found here.

Your final query will look something like this:


SELECT 1234 AS pixel_id,             -- your Facebook pixel ID
	orders.order_id AS event_id,     -- matches the pixel's event ID for deduplication
	'website' AS action_source,
	orders.created_at AS event_time, -- this must be GMT
	'Purchase' AS event_name,        -- this must match the pixel's event name
	customers.email,
	customers.first_name,
	customers.last_name
FROM orders
LEFT JOIN customers ON orders.customer_id = customers.id

And your dataset will look like this:

| pixel_id | event_id | action_source | event_time          | event_name | email          | first_name | last_name |
|----------|----------|---------------|---------------------|------------|----------------|------------|-----------|
| 1234     | 5678     | website       | 2021-01-01 00:00:00 | Purchase   | john@smith.com | John       | Smith     |

You now have your data to supplement the view Facebook acquired via the conversion pixel. 🎉
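
One practical note on the customer fields: the Conversions API expects user details such as email, first name, and last name to be normalized and SHA-256 hashed before they’re sent. Reverse ETL tools typically handle this for you, but if you’re preparing the payload yourself, here’s a minimal sketch in Python (the column names are taken from the query above; the values are illustrative):

import hashlib

def normalize_and_hash(value):
    # Lowercase, trim, and SHA-256 hash a value, as the Conversions API expects.
    return hashlib.sha256(value.strip().lower().encode("utf-8")).hexdigest()

# One row from the query above (hypothetical values).
row = {"email": "john@smith.com", "first_name": "John", "last_name": "Smith"}

# Facebook's user_data field names: em = email, fn = first name, ln = last name.
user_data = {
    "em": normalize_and_hash(row["email"]),
    "fn": normalize_and_hash(row["first_name"]),
    "ln": normalize_and_hash(row["last_name"]),
}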

How to import your conversions into Facebook Ads using a reverse ETL tool

Finally, you’ll upload this data into Facebook Ads using the Conversions API.

You can certainly build your own integration with the Conversions API using the Business SDKs Facebook provides to developers. For regular conversion uploads, a production-ready pipeline to Facebook Ads will need to run on a schedule, respect API rate limits, and handle API errors gracefully. Production use cases also require logging and alerting to troubleshoot conversions the API rejects (e.g. when the data doesn’t meet its requirements), plus the ability to resend those records after making a fix.
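
If you do go the build-it-yourself route, the core of such a pipeline is a scheduled job that batches events and POSTs them to the Conversions API endpoint. Here’s a rough sketch in Python using plain HTTP requests; the Graph API version, access token handling, and retry policy are assumptions you’d adapt to your own setup, and a real pipeline would add proper logging and alerting:

import json
import time
import requests

# Assumptions for this sketch (not from the article): a Graph API version,
# an access token with ads permissions, and the pixel ID from the query above.
GRAPH_API_VERSION = "v17.0"
PIXEL_ID = "1234"
ACCESS_TOKEN = "YOUR_ACCESS_TOKEN"

def send_conversions(events, max_retries=3):
    # POST a batch of conversion events to the Conversions API, retrying on failure.
    url = f"https://graph.facebook.com/{GRAPH_API_VERSION}/{PIXEL_ID}/events"
    payload = {"data": json.dumps(events), "access_token": ACCESS_TOKEN}
    for attempt in range(max_retries):
        response = requests.post(url, data=payload, timeout=30)
        if response.ok:
            return response.json()
        # A production pipeline would log this, alert on rejected records,
        # and queue them for resending after a fix.
        print(f"Attempt {attempt + 1} failed: {response.status_code} {response.text}")
        time.sleep(2 ** attempt)  # simple exponential backoff
    raise RuntimeError("Conversions API request failed after retries")

if __name__ == "__main__":
    # One event shaped like a row of the query above, with user data hashed
    # as shown earlier.
    send_conversions([
        {
            "event_name": "Purchase",
            "event_time": 1609459200,   # Unix time (GMT)
            "event_id": "5678",         # matches the pixel's event ID
            "action_source": "website",
            "user_data": {
                "em": "<sha256 of email>",
                "fn": "<sha256 of first name>",
                "ln": "<sha256 of last name>",
            },
        }
    ])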

With Census, this step is as simple as taking all of the data you’ve identified from your data warehouse and mapping it to the relevant fields in Facebook Ads. Click “sync”, set the integration to run on a schedule, and your data will be on its way to Facebook regularly. Census will log errors and alert you when the API rejects any conversions you need to investigate.

Harness reverse ETL to better train your Facebook Ads campaigns

You can improve the ROI of your ads campaigns by increasing the volume of conversion signals you send to Facebook Ads. Using your own homegrown integration or a tool like Census, you can leverage the Facebook Ads Conversions API to improve the chance that your ads are being shown to the right people at the right time, which means a greater chance of conversion. 🙌

Want to improve your Facebook Ads targeting by supplementing Facebook’s pixel data, or are you curious how to implement this approach at your own company? Please don’t hesitate to book a call with us here to speak more about it. Or, if you’ve found a better way than reverse ETL, I’d love to hear about it.
