Saturday, March 21, 2015

Lessons From a Prediction Competition

What did we learn from predicting song popularity?

A recent data science competition focused on whether one could predict the success of a Taylor Swift song from less than half a second of audio.  One sample sounded like:

[embedded audio clip]

The objective was to predict whether that clip (and some 20k others) came from a popular song, or not.  As mentioned in a previous post this is a case where there aren't descriptive elements that "make sense" - no genre, gender, demographics, contact history, or anything that marketers typically rely on to help them understand what is going on.  Just 3,300 numbers - for the audio among you, down-sampled to 11k.

The winning models got to above 99% accuracy, which sounds a bit too good to be true. While technically correct, there are some interesting lessons to be learned.

First, by rethinking the problem as a segmentation problem rather than a set of individual estimates, the results got a lot better. That is, grouping clips based on their similarity improved accuracy - this is no different than targeting audiences as opposed to specific individuals.
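As a rough illustration of that grouping step, here is a minimal k-means sketch in plain numpy. The "clips" and cluster count are toy stand-ins, not the competition data:

```python
import numpy as np

def kmeans(X, k, iters=20, seed=0):
    """Plain k-means: group rows of X into k clusters by Euclidean distance."""
    rng = np.random.default_rng(seed)
    centers = X[rng.choice(len(X), size=k, replace=False)]
    for _ in range(iters):
        # assign every clip to its nearest center
        dists = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=-1)
        labels = dists.argmin(axis=1)
        # move each center to the mean of its assigned clips
        for j in range(k):
            if (labels == j).any():
                centers[j] = X[labels == j].mean(axis=0)
    return labels

# toy "clips": two obviously different groups of 4-sample signals
X = np.array([[0., 0, 0, 0], [1, 1, 0, 0], [9, 9, 9, 9], [10, 10, 9, 9]])
labels = kmeans(X, k=2)
```

Once clips are grouped like this, a label can be assigned per cluster rather than per clip - the audience-level view rather than the individual one.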

Second, simple models tended to work as well as complex ones. In this case accuracy mattered, so effort went into improving it as much as possible. But there are times when good enough is, well, good enough.  It turns out that a simple model of "how similar is this clip to its nearest neighbor?" worked very well. With this challenge, that makes sense: pop songs are similar across their 3-4 minutes. After the competition I tried some really, really simple (and possibly stupid) ideas and did better than my official submission. Don't overthink it.
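That nearest-neighbor idea fits in a few lines. A sketch, with made-up three-sample "clips" standing in for the real 3,300-number vectors:

```python
import numpy as np

def nearest_neighbor_label(clip, train_X, train_y):
    """Label a clip with the label of its closest training clip
    (Euclidean distance over the raw samples)."""
    dists = np.linalg.norm(train_X - clip, axis=1)
    return train_y[dists.argmin()]

# toy stand-ins for the audio vectors
train_X = np.array([[0., 0, 1], [5, 5, 5], [0, 1, 0]])
train_y = np.array([0, 1, 0])   # 1 = popular, 0 = not
print(nearest_neighbor_label(np.array([4., 5, 6]), train_X, train_y))  # prints 1
```

No model fitting at all; the new clip simply inherits the label of whatever it most resembles.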

Third, we needed to take a step back and align technique with problem.  The same data and the same logic produced different results based simply on the approach taken. Always use two or more methods.

A big thanx to Devin Diderickson for posting his approach and thought process.  (He finished second, I finished sixth).

Marketing Analytics Requires Judgment

What advantage do marketers have over machines?

There are lots of cool examples of using math, models and machines to predict, classify and recommend.  Some can be found on Econsultancy's site. Within the domain of marketing analytics, machine learning is applied when we are simply trying to discern cues about our business from the data.  It works: we can predict content you might like on Netflix or distinguish a motorcycle from a car (most of the time). But it is hard to do; Netflix offered a million-dollar prize to anyone who could improve on their approach by 10%.

Why is analytics hard?

Computers can't see.

The more complex the problem and/or approach, the more time we spend on working out how not to generate spurious results. Something as simple as correlation needs human oversight. Here is one of my favorites from a great collection of spurious findings.

It appears that Nicolas Cage movies and drownings in swimming pools go hand-in-hand.



Then there's the one about the decrease in honey bee colonies tracking with the rise in juvenile arrests for marijuana.  Really?
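The trap is easy to reproduce. Two invented series that do nothing but trend upward over the same years will correlate almost perfectly, even though neither has anything to do with the other (the numbers below are made up, not the real chart data):

```python
import numpy as np

# two invented series that simply trend upward over the same 11 "years"
years = np.arange(11)
cage_films = 2 + 0.3 * years + np.array(
    [0.1, -0.2, 0.05, 0.3, -0.1, 0.2, -0.15, 0.1, 0.0, -0.05, 0.2])
pool_drownings = 90 + 3 * years + np.array(
    [2, -1, 0.5, -3, 1, 2, -1.5, 0, 1, -0.5, 2])

# a "strong" correlation, driven entirely by the shared trend
r = np.corrcoef(cage_films, pool_drownings)[0, 1]
```

The shared trend, not any causal link, pushes r above 0.9; that is exactly why correlations need a human asking whether they make sense.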

Computers lack what seasoned marketers bring to the table - judgment. At present, a lot of the work in the "big data" space is focused on how best to capture human knowledge in the analytics. In image recognition, for instance, how do we discover that there are in fact cats on YouTube if we don't know there are cats to begin with?

In short, we are trying to find ways to learn what we don't know we need to learn. This unsupervised learning can be a lot like kindergarten recess where we need an adult on the playground to keep us safe.

Thinking and judgment will never go out of style.

Always ask: Does it make sense?


Tuesday, March 10, 2015

Linking Instagram, Pinterest and Houzz to e-commerce

What products do my pictures suggest?

This is a continuation of an idea that arose from a conversation in a bar around the question: "How does content work?"

Recommendation engines typically work with data from one site. We've all seen examples of "people also bought..." on web sites. But an interesting idea would be to link my curated content (Instagram, Pinterest, or Houzz) to the shopping experience of an e-commerce site.

To illustrate the idea, here's a widget from Houzz narrowed to bedrooms:


Assuming for a second that this is my idea book, I would appreciate the shopping site recommending the second bed below rather than the first.




To make this a reality requires a couple of things. First, we need access to both sets of pictures - this is a perfect use case for a widget. The benefit seems obvious: "Log on to your x account and we'll do the heavy lifting of sorting thru a couple of hundred thousand items for you."

Second, we need to identify what the pictures contain or portray. This argues for auto-tagging of images to classify and assign attributes. It is unlikely that traditional workflow processes can handle this task because consistency across two businesses is required. For this to work we need the same method of classifying two sets of pictures (mine and theirs) to a common set of standards. The good news is there are web services for doing this on the fly. Although, if I were an e-commerce site or travel site I'd pre-process my pictures as part of the editorial/content management process.

Third, we need to define something that allows us to rank the pictures, and that means we need a measure of similarity. In market basket analysis it is easy: count the times each pair of products appears in the shopping cart and pick the most common. Here, though, all we have is a list of tags for each picture. Approaches to similarity then range from counting the shared tags across the whole set to computing how many steps are required to convert one set into the other.
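A count of shared tags normalized by all distinct tags (Jaccard similarity) is about the simplest version of this. A sketch with hypothetical tags:

```python
def jaccard(tags_a, tags_b):
    """Shared tags over all distinct tags: 1.0 means identical tag sets."""
    a, b = set(tags_a), set(tags_b)
    return len(a & b) / len(a | b) if a | b else 0.0

# invented tags for one of my idea-book pictures and one product photo
mine   = ["bedroom", "upholstered", "tufted", "grey"]
theirs = ["bedroom", "tufted", "headboard", "grey", "king"]
print(round(jaccard(mine, theirs), 2))  # prints 0.5
```

Ranking every product photo by this score against my pictures gives the "second bed, not the first" ordering.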

The web services I looked at both give a 'confidence' score for each tag. This numeric value can be used to filter tags (clean out those we're not sure of) or to weight them. This gives the solution a new dimension and also helps remove judgment from the picture.
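A sketch of that weighted variant, assuming the service returns each picture's tags as a tag-to-confidence map (the tags, scores, and threshold here are all invented):

```python
def weighted_overlap(tags_a, tags_b, min_conf=0.6):
    """Drop low-confidence tags, then score the overlap by the
    product of the two confidences for each shared tag."""
    a = {t: c for t, c in tags_a.items() if c >= min_conf}
    b = {t: c for t, c in tags_b.items() if c >= min_conf}
    return sum(a[t] * b[t] for t in a.keys() & b.keys())

mine   = {"bedroom": 0.95, "tufted": 0.80, "plant": 0.40}
theirs = {"bedroom": 0.90, "tufted": 0.70, "lamp": 0.85}
score = weighted_overlap(mine, theirs)   # "plant" is filtered out at 0.40
```

Tags the tagger is unsure about drop out entirely, and confident matches count for more than shaky ones.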

Next up: getting my own idea book and curated content classified and doing some more experimentation.

Monday, March 09, 2015

Marketing Measurement: Lessons from Physics

How should we measure our efforts?

There is nothing new about the idea of measuring marketing. What is changing, however, is our ability to do so. Still, we're not there yet.


Metrology, or the science of measurement, requires three things:
  1. An internationally accepted unit of measure.
  2. How to realize those units in practice, i.e., just what is a meter?
  3. Chains of traceability between a measurement and a standard.
In the series "The Science of Measurement" Marcus du Sautoy describes the history and approach taken to nail down the seven fundamental things to be measured. (And I paraphrase the physics.)
  • Time - the second used to be a fragment of a celestial cycle and now is 9B+ spins of an atom; we've moved from personal time zones (there were thousands in the US before the transcontinental train) to highly precise coordination. What good would Snapchat be without a common sense of expiry?
  • Distance - starting out as the length of the pharaoh's arm and moving to 1 ten-millionth of the distance between the North Pole and the Equator, it is now how far light travels in a very short period of time. Imagine an inaccurate GPS. 
  • Mass - the weight of pure water, at sea level and at freezing point, has been the reference point; this is the last measurement to be based on an artifact rather than a fundamental law of physics. Media impressions have often been described in "tonnage" to reflect reach and frequency. 
  • Moles - measures how much stuff is involved without worrying about mass or weight. It makes conversions easier to understand and handle. For instance, two parts hydrogen + one part oxygen equals one part water. Gross/Target Rating Points are one way we try to standardize across channels. 
  • Light - possibly the easiest to grasp and yet most peculiar, light is responsible for what we see and is defined in waves. The challenge is that our eye adapts to light creating two types of measurements - energy and "in the eye of beholder." What we observe in marketing is altered by both our observations and our biases. 
  • Heat - is defined by how fast something moves with absolute zero being the absence of movement. Temperature is simply a measure of activity.  Word of mouth and influence would be analogous to positive heat. Poor customer service would be negative due to the friction it creates.
  • Electricity - is all about the flow of stuff from one place to another thru time and space (lightning strikes for example). As evidence of the use of fundamentals, heat can be measured by electricity - the oven probe does it with a thermocouple. The customer journey and path-to-purchase deal with the flow of people with lots of insulators, leaky funnels and conductors.
A key point of the series is that with every increase in the precision of measurement comes a leap in technical disruption with new capabilities emerging.

In marketing, Ashu Garg of Foundation Capital recently wrote a nice white paper on the "Decade of the CMO" in which he articulated the seven most important metrics in marketing.
  1. Marketing ROI
  2. Customer Experience
  3. Conversion Rate/New Customers
  4. Overall Sales
  5. Marketing-influenced Sales
  6. Revenue-per Customer
  7. Social Media Metrics
While all good, it would seem we are still in the middle ages of marketing measurement. There are no precise or accepted ways to ground many of those metrics in unassailable reference points. Just what is an "experience" or an "influence"?   Even "sales" can be fuzzy to nail down within one company, let alone across companies.

Another takeaway from the series is: figuring this out isn't easy or quick. It often takes standing on the shoulders of giants to make progress so expect this to evolve over time. In the short term, we run the risk of measuring what we can, not what we need. It will take dedication from those who think differently to crack this nut.

Which leaves us with a fundamental question:

What are the building blocks of consumer choice?

Tuesday, February 24, 2015

Marketing Challenges: Updated

What's changing in marketing?

I stumbled upon a piece in McKinsey's insights section entitled "The Changing Face of Marketing". The author outlines six factors impacting marketing.

  1. Customer - "the end users of almost every company’s products are shifting in makeup, location, and number at an ever-increasing rate."
  2. Insights - "If knowledge about future customers is essential, and if the quality of the marketing output is materially affected by the caliber of the informational input" then we need more.
  3. Technology - "the computerization of many areas of marketing is only a matter of time"
  4. Testing - "more controlled experimentation to narrow the odds of an error in making marketing changes"
  5. Sales as marketing - [the] "job is becoming less and less the presentation of the company’s product line, more and more the marketing of integrated systems."
  6. Global - the problem is "determining how to provide most efficiently the marketing services needed—services that in many companies today are directed, if not executed outright, by a central corporate staff."
This summarizes what we see in the marketing services world every day.

It seems that the only thing new is history we haven't learned yet. The article was written in 1966. 

Tuesday, February 17, 2015

Predicting Music Popularity and Marketing Analytics

What do Taylor Swift's videos have to tell us about marketing?

There's a machine learning competition running here in Salt Lake City that deals with predicting the popularity of Taylor Swift videos from snippets of music. And by snippets I mean 1/10th of a second of a song. The goal is to classify whether a song is a 'success' or not (taking 30M views as unsuccessful and 300M as a success).



In marketing we try to understand what makes a consumer tick, but in this case an audio track has been converted to a series of numbers that look something like the following. Now imagine 3,300 columns of numbers for each sample...

  Success      V1      V2      V3      V4      V5      V6      V7      V8      V9
        0   -5787    9566    -511   -2274   18589    1170    2232    5073   -2578
        1    1067    -521    1524    -209    -957    -777   -2666     716   -3273
        1     397    -314   -1701   -2568    -463    1123    1041     916     789
        1     784   -1017    2038    4134    1453    3731    3644    2759   -2033
        0  -19632  -13344  -14428  -13195   -3940   -1306   -1206   -1303   -1256
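To make "the math just works" concrete, here is a toy nearest-centroid classifier over rows shaped like the one above. The numbers are illustrative, trimmed to three feature columns, and nothing here is the competition's actual model:

```python
import numpy as np

# rows shaped like the table: label first, then feature columns (toy values)
data = np.array([
    [0,  -5787,   9566,   -511],
    [1,   1067,   -521,   1524],
    [1,    397,   -314,  -1701],
    [0, -19632, -13344, -14428],
], dtype=float)
y, X = data[:, 0], data[:, 1:]

# average the feature rows per class, then label a new clip
# by whichever class mean it sits closer to
centroids = {c: X[y == c].mean(axis=0) for c in (0, 1)}

def predict(row):
    return min(centroids, key=lambda c: np.linalg.norm(row - centroids[c]))

print(predict(np.array([800., -900., 1800.])))  # near the "success" rows: prints 1
```

No column ever gets a human-readable name; the geometry of the numbers does all the work, which is exactly why the result needs a translator.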

This is where you have to trust the machine and the team. There will be no conference room debate as to what impacts success - the math just works. And that raises an important question: how do we now improve the consumer experience and deliver on the brand promise?

Keeping with the music theme, a musicologist can deconstruct the signals into themes and styles, much in the same way that Pandora was built on a music genome. As a result, the experts won't just be the ones who can apply a Restricted Boltzmann Machine to a classification problem, but those who can translate the results. Interpreters will be in as much demand as the coders.

Keeping my Rosetta Stone polished.