Product News

Census Sync: What's new in October 🎃

Sean Lynch
October 26, 2021

Sean is co-founder and Chief Product Officer (CPO) at Census. He loves enabling data-driven organizations, so he's energized by introducing the world to Data Activation.

At Census, one of our product values is Just works™. This month (besides decorating our office for Halloween), we’ve focused on living up to that value by improving smaller pieces of the end-to-end sync experience, with new features such as automatically adding new fields to syncs, a better sync configuration UX, and more customization for scheduling and alerting.

[Image: This isn’t Boris, we promise.]

🚀 Read on to learn about what we added to the product in October.

Set and forget: Automatically add new fields to syncs if they appear in the model

This one’s been a long time coming. You can now select “Sync all properties” when setting up your sync, and Census will automatically create any new field added to your model or table as a new field in the destination. You can even choose the naming convention Census uses for the new fields (e.g. UPPERCASE, CamelCase, etc.).

Note: This is only supported for destinations that allow creating new fields via Census. See all supported destinations in the docs.

More powerful sync field configuration

If you’ve created or edited syncs lately, you’ve probably noticed some changes in the user experience. We're excited to announce that you can now customize sync behavior down to the field.

For update and create syncs, you can choose to customize sync behavior per field. For example, you can choose to only set a field value if the existing field is empty, rather than always overriding the existing value.

We've also flipped the mapper UI so database columns are on the left and destination fields are on the right to make it easier to understand the data flow.

👉  We want your feedback! Ping us via chat, on Slack or shoot us an email with your thoughts.

Avoid alert fatigue: Set thresholds for invalid record alerting

In addition to choosing to receive alerts for invalid or rejected records, you can also specify the threshold you want to be alerted on so you don't miss anything important.

This alert now defaults to on for all syncs and triggers if over 75% of the records from the source are invalid or rejected by the destination.

You can lower this threshold, or even trigger an alert on any invalid record, for those syncs you want to keep an extra eye on.

👉 Read the docs to learn more on alerts in Census.

Create syncs programmatically via the POST /syncs API

We’ve extended the Census API to support creating syncs programmatically.
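As a rough sketch of what that can look like from a script (using Python's requests library), you might do something like the example below. The endpoint path, auth scheme, and payload field names here are illustrative assumptions only; the API docs linked below have the real schema.

```python
import requests

# Hypothetical sketch of creating a sync via the Census API.
# The endpoint URL, auth scheme, and payload fields below are placeholders;
# consult the Census API docs for the real schema.
API_TOKEN = "your-census-api-token"

payload = {
    "label": "Users to CRM",              # illustrative values only
    "source": {"model_id": 123},
    "destination": {"object_id": 456},
    "operation": "upsert",
    "schedule": {"frequency": "hourly"},
}

response = requests.post(
    "https://app.getcensus.com/api/v1/syncs",   # assumed base URL
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()
print(response.json())   # details of the newly created sync
```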

👉 Read the API docs to learn more.

Customize your schedule down to the exact hours and days with Cron

Want to schedule syncs to run hourly on weekdays only? Or how about once every four hours?

You can now specify your sync schedule with incredible granularity using a Cron expression.
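For example, assuming standard five-field cron syntax (minute, hour, day of month, month, day of week), the two schedules above would look roughly like this:

```
0 * * * 1-5   # hourly, but only Monday through Friday
0 */4 * * *   # once every four hours, every day
```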

👉 Read the docs to learn more.

Cancel running syncs

You can now hit pause during an actively running sync and choose to cancel the job. This will also pause the schedule so the sync won’t attempt to run again until you hit “Resume”.

S3 (New destination!)

Schedule secure data exports from your warehouse to CSV files in your S3 buckets. Read the docs to learn more.

SFTP (New destination!)

Send data from your warehouse to your SFTP server. Read the docs to learn more.

Pipedrive (Updated!)

Added support for deal and notes objects. Read the docs to learn more.

Braze (Updated!)

Added an option to null out mapped fields instead of deleting entire records when using Mirror sync for the Braze user object. This makes it easier to manage audience attributes in Braze without worrying about API rate limits. Read the docs to learn more.

Mixpanel & Amplitude (Updated!)

Added support for Properties Bundle passthrough. Rather than mapping individual fields, you can now just pass a JSON object of properties directly through to Mixpanel or Amplitude (and coming soon for other event data destinations!). Read Mixpanel or Amplitude docs to get started.
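For illustration, a properties bundle is just a single column holding a JSON object of event properties, something like the made-up payload below, which Census then passes through to Mixpanel or Amplitude as-is (the field names are placeholders):

```json
{
  "plan": "pro",
  "seats": 12,
  "signup_source": "webinar"
}
```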

❤️As always, a big thank you to you, our Census Champions, for all your feedback and for inspiring us with all you do with data. If you’ve made it this far, you can claim your free Census Champion swag pack.

Which one of these updates are you excited to try? Contact us with any questions or feedback, or get started in Census today.

See you next month for more. 🚀
