Users that didn't convert
Tuesday, February 28, 2012 | 2:35 PM
Labels: best practices, conversions, data manipulation
A question frequently posed to our services teams is "I devote time and attention to learning all I can about users who convert, but what can I learn from users who do not convert?"
(Such questions about converting users tend to come up more among advertisers and agencies using DoubleClick for Advertisers, but even if you use Floodlight/Spotlight tags in DoubleClick for Publishers, you've probably asked the same question.)
First, some ground rules about the term "non-converter". We find that the term "non-converter" in practice refers to a number of different things; deciding which is most important to you is a good place to begin answering this question. The following are groups that are commonly referred to as "non-converters":
a) Users who saw your ads and visited your website, but did not convert.
b) Users who saw your ads but did not visit your website (and thereby did not convert).
c) Users who did not see your ads but did visit your website.
Data Transfer is especially well suited to developing insights about these groups. Let's focus on the first two cases (though you can learn about all three using DT). How do you identify which exposure pathways or interaction patterns are truly valuable and which are not? How do you identify user interactions that generate a large number of conversions but also drive users away? What patterns can be observed among users who do not convert? Your results may be quite surprising. (Note: the following explanation requires that you receive and process all three Data Transfer file types: Impression, Click and Activity.)
Start by inventorying the user IDs from your Activity files over the exposure window of your preference, say 30 days. Next, compare these user IDs against those found in your Impression and Click files over that window plus the preceding 30 days, since an exposure can occur up to 30 days before the conversion it drives (you may also choose to add one extra day to your Impression and Click files to account for visits that occur just after midnight). At this point you're processing roughly 60 days of Impression and Click files. Now, identify those user IDs in your Impression and Click files that either do not have an Event-ID value of '1' or '2' (the identifiers signaling a click conversion or an impression conversion) or do not appear in your Activity files at all. Finally, sort the resulting Impression, Click and Activity records for each user ID in reverse chronological order. The result is a list of users who were exposed to, and may have interacted with, your ads but, for whatever reason, did not convert.
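The matching logic above can be sketched in a few lines of Python. The in-memory records and field names ("user_id", "event_id", "event_time") are illustrative placeholders; real Data Transfer files are delimited logs whose layout depends on your configured DT format.

```python
# Illustrative in-memory records; real Data Transfer files are delimited
# logs whose field names and layout depend on your configured DT format.
impressions = [
    {"user_id": "u1", "event_id": "0", "event_time": "2012-02-01 10:00:00"},
    {"user_id": "u2", "event_id": "2", "event_time": "2012-02-02 09:30:00"},
]
clicks = [
    {"user_id": "u1", "event_id": "0", "event_time": "2012-02-01 10:05:00"},
    {"user_id": "u3", "event_id": "1", "event_time": "2012-02-03 14:00:00"},
]
activities = [
    {"user_id": "u2", "event_time": "2012-02-02 09:45:00"},
    {"user_id": "u3", "event_time": "2012-02-03 14:10:00"},
]

# User IDs inventoried from the Activity files.
converters = {rec["user_id"] for rec in activities}

# Group each user's Impression and Click records together.
exposures = {}
for rec in impressions + clicks:
    exposures.setdefault(rec["user_id"], []).append(rec)

# Treat a user as a non-converter only when neither conversion signal is
# present: no Event-ID of '1' or '2' and no appearance in the Activity files.
# Each surviving user's records are sorted in reverse chronological order.
non_converters = {
    uid: sorted(recs, key=lambda r: r["event_time"], reverse=True)
    for uid, recs in exposures.items()
    if uid not in converters
    and not any(r["event_id"] in ("1", "2") for r in recs)
}

print(sorted(non_converters))  # exposed users with no sign of converting
```

At production scale the same set logic maps naturally onto a database join or a MapReduce-style pipeline rather than in-memory dictionaries.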
At this point you can take action, such as using Remarketing, better content targeting, Creative Optimization to deliver a stronger message, or site optimization to improve the user experience. You may even opt to run a DFA Experiment, which divides users into treatment and control segments so you can compare ad effectiveness.
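A DFA Experiment performs the segmentation for you. As a rough offline analogue (an illustrative technique, not DFA's actual mechanism), a deterministic hash of the user ID can bucket users into stable treatment and control segments:

```python
import hashlib

def assign_segment(user_id: str, treatment_fraction: float = 0.5) -> str:
    """Deterministically assign a user to 'treatment' or 'control'.

    Hashing keeps each user's segment stable across files and days,
    so repeated processing never flips a user between segments.
    """
    digest = hashlib.sha256(user_id.encode("utf-8")).hexdigest()
    bucket = int(digest[:8], 16) / 0xFFFFFFFF  # roughly uniform in [0, 1]
    return "treatment" if bucket < treatment_fraction else "control"

segment = assign_segment("u1")
```

Because the assignment depends only on the user ID, it can be recomputed independently by any job that touches the same DT files.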
--Matthew Trojanovich and Ryan DeVito, Data Transfer Team
Google BigQuery Service: Big data analytics at Google speed
Thursday, November 17, 2011 | 5:23 AM
Labels: data manipulation
(Cross-posted on the Google App Engine Blog, the Google Enterprise Blog and the Official Google Blog)
In a previous post we mentioned the Google BigQuery Service. Development of this service has indeed progressed at Google speed; since that time we've introduced a graphical user interface, made significant improvements in accessing the service programmatically through the API, and expanded the functionality of JOIN statements. I thought you'd like to see a screenshot.
Please note that BigQuery is currently in preview and open to a limited number of enterprises and developers. Please sign up to get on the waitlist and be notified when you can start using BigQuery. For more information, take a look at the Getting Started document.
-- Matthew Trojanovich, Data Transfer Team
Google BigQuery and Data Transfer
Friday, September 16, 2011 | 11:43 AM
Labels: best practices, data manipulation
Wouldn't it be great if you could program against your Data Transfer files without ever having to first download them from DoubleClick FTP, then transform them, then load them locally and only then analyze them? Here's a sneak peek of something we're very excited about: it's called BigQuery, and it aspires to do all that and more using the cloud and Google's APIs.
Below is a snippet from a recent Google Code API post. What would it mean for your business if all of your DT data were accessible in this way?
-----
In January of this year we launched BigQuery integration with Google Apps Script. What we didn’t mention was that we were building this on top of our Google APIs Discovery Service. Thanks to the ease and flexibility of writing clients based on this API, today we’re announcing integration with three more APIs, and revamping our BigQuery support.
As of now, we have also integrated the Tasks API, Prediction API, and URL Shortener API in addition to the BigQuery API. You can now include these APIs in your scripts, apps, and sites pages. As with other Apps Script services, we handle all of the server communications as well as authorization, which makes this a great way to build mashups and workflows using our APIs.
To get started, simply enable the APIs you’re interested in from the "Use Google API services" menu in the script editor. [...]
-- Matthew Trojanovich, Data Transfer Team
NOTICE - Doubleclick Data Transfer files for 2011-08-09 should be re-downloaded
Friday, August 12, 2011 | 2:11 PM
Labels: notification
We have experienced an issue with the processing of Data Transfer files for August 09, 2011. The originally posted files may contain incorrect or incomplete data. All files have been corrected and re-uploaded to the FTP server as of 1:30 PM ET, August 10, 2011. All files downloaded prior to this time should be re-downloaded to ensure accurate data. Please note that any custom reports using Data Transfer should also be re-downloaded. Our technology teams are monitoring the system to ensure this issue does not recur.
We thank you for your patience and apologize for any inconvenience this may have caused.
Regards,
DoubleClick Media & Platforms Solutions
Upgraded Match Tables
Tuesday, July 19, 2011 | 1:46 PM
Labels: DFA, DFP, match tables, Rich Media DT
We've recently updated the available Match Tables for DFA, DFP and Rich Media. If you'd like to upgrade to the "latest and greatest" we encourage you to get in touch with your Account Manager to make a request. Upgrades to Match Tables incur no cost and can usually be turned around in just a few days.
What updates have we made?
Primarily we've added new fields to existing Match Tables. We've also added a few new Match Tables and, in a small number of cases, removed defunct or rarely used fields. With the additional files and fields you gain access to new product features and improved visibility of existing product features.
Specifically, these are the changes:
For DFA - new ad_comments file, new mobile_keyvalues file, new page_cost file, new page_flight_cost file; updated fields in the advertiser, campaign, page and creative files; updated all date fields to be expressed in the same format mask as your DT events (MM-DD-YYYY); removed unused creative fields; improved format request form.
For DFP - new ad_comments file, new mobile_keyvalues file, new system_defaults file; updated fields in the advertiser, order, order_pricing, ad and creative files; updated all date fields to be expressed in the same format mask as your DT events (MM-DD-YYYY); removed unused creative fields; improved format request form.
For Rich Media - Improved format request form.
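Since all match-table date fields now use the same MM-DD-YYYY mask as your DT events, they parse with a single format string. A minimal sketch; the field name `start_date` and the sample row are illustrative, not guaranteed column names:

```python
from datetime import date, datetime

def parse_mt_date(value: str) -> date:
    """Parse a match-table date field in the MM-DD-YYYY format mask."""
    return datetime.strptime(value, "%m-%d-%Y").date()

row = {"campaign_id": "12345", "start_date": "07-19-2011"}  # illustrative row
print(parse_mt_date(row["start_date"]))  # prints 2011-07-19
```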
How can you tell if you are already using the new files?
You can request a copy of the current DT/MT setup form from your Account Manager. Compare the files and fields therein, looking for the above changes. We tried to bundle as many changes as we could into this new MT version so that you don't have to worry about repeated MT changes. Finally, to be clear, no MT changes are made to your files without your first requesting that change from us.
We hope you find the improvements helpful to your data analysis efforts.
-- Matthew Trojanovich, Data Transfer Team