This document outlines 13 secrets to successful metrics-based marketing that experts don't always discuss. It argues that testing and data alone do not guarantee success: you need to know when to test, what to test, and how to interpret results. A structured, recursive process is recommended to prioritize what to test, develop hypotheses, analyze outcomes, and continually refine understanding based on new data. Marketing must be integrated with creativity, business strategy, and an obsession with both high-level positioning and low-level details to most effectively guide product development and messaging.
2. Page 2
Modern Marketing:
Data, Data, Data
• Modern marketing revolves around testing
and metrics.
• Direct Marketing conquers the world!
• We love Dave McClure; we’re ‘Pirate
Marketers’
• “But”…
3. Page 3
Portia Says In
The Merchant of Venice
“If ‘to do’ were as easy as ‘to know what
were good to do’, chapels had been
churches and poor men’s cottages
princes’ palaces.”
4. Page 4
The Gurus Don’t Tell You…
There are 13 secrets to turning this theory into practice.
• Gurus tell us to build systems designed to accommodate
failure (“Fail Faster!”) but they don’t tell us how to do it.
• They tell us how to think but not what to do.
• If it were just a case of “test & improve based on the
numbers”—then every site would turn into Facebook!
5. Page 5
#1: Having A ‘Testing+Metrics Hammer’
Doesn’t Make Everything A ‘Nail’
• Just because you have a hammer doesn’t
mean everything is a nail.
• There are times to use testing & metrics—
and times not to.
• Misused, they lead you to test the wrong things,
misinterpret data, and make bad decisions.
6. Page 6
Example: A Startup That
Wanted To Hire Us
• A luxury travel startup. VC funded.
• They bought into the measurement worldview.
• Built a beautiful website—with their product
hidden.
• Spent thousands of dollars testing whether 4 or 6
minor video icons on the bottom of the front page
yielded more conversions.
7. Page 7
#2: Premature Optimization
Is The Root Of All Evil
• So sayeth the Great Donald Knuth
• You invest a lot of time & energy optimizing the wrong
thing.
• In general, it’s difficult to know if testing is premature.
• You need a process to determine when to test or not to.
• Google’s easy optimization infrastructure thus
encourages evil!
8. Page 8
#3: Don’t Make Decisions Via Testing
• When you can’t make a decision, it is tempting to say “we’ll test it.”
• Therefore, often testing is a cover for indecision.
• A test reveals how good an idea is. A test result is not a decision.
• Example:
– We lowered the prices of a client’s product (an eBook)
– Sales & profitability improved slightly
– We then made the strategic decision to raise the price drastically
– That made much more profit!
• The small optimization didn’t help as much as the drastic change.
9. Page 9
#4: Tests Must Test Hypotheses
• Good testing only tests hypotheses. If it doesn’t test a
hypothesis, you can’t learn—so it’s a waste.
• Your testable proposition must be based on general
theories you create. Theories about…
– The Product
– The Marketing
– Human Nature
• Each hypothesis must include a falsification criterion: it must be possible to disprove it
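The hypothesis-plus-falsification structure can be sketched in code. This is a minimal illustration, not the authors' actual tooling: the `Hypothesis` class, the 1-point minimum lift, and the one-sided z-test threshold are all invented for the example.

```python
import math

class Hypothesis:
    """A testable marketing hypothesis with an explicit falsification rule."""

    def __init__(self, claim, min_lift):
        self.claim = claim          # e.g. "A shorter signup form raises conversion"
        self.min_lift = min_lift    # smallest lift that would count as support

    def falsified_by(self, control_rate, variant_rate, n_control, n_variant):
        """Return True if the test data fail to support the claim.

        Uses a two-proportion z-test; a lift below min_lift, or a z below
        1.64 (one-sided, ~95% confidence), counts as falsification.
        """
        lift = variant_rate - control_rate
        pooled = (control_rate * n_control + variant_rate * n_variant) / (n_control + n_variant)
        se = math.sqrt(pooled * (1 - pooled) * (1 / n_control + 1 / n_variant))
        z = lift / se if se else 0.0
        return lift < self.min_lift or z < 1.64

h = Hypothesis("Shorter signup form raises conversion", min_lift=0.01)
print(h.falsified_by(control_rate=0.050, variant_rate=0.052,
                     n_control=4000, n_variant=4000))  # True: lift is under 1 point
```

The point of the sketch is the shape, not the statistics: the disproof condition is written down before the test runs, so a weak result cannot be rationalized afterwards.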
10. Page 10
#5: Every Good Test Is Actually
A Test Of Your Worldview
• Your worldview underlies everything—so every
worthwhile test tests the assumptions about human
nature embedded into your product and marketing.
• A color test really tests human nature: are girls more
attracted to green than guys are? Why?
• Ask yourself: what views of human nature are embedded
into everything we’re doing? Test these.
• Sometimes data changes our worldview. As we’ve done
more testing, we’ve become more cynical!
11. Page 11
#6: API To Creativity
• Art must supplement science.
• We need to apply massive, structured creativity at every point.
• We build APIs to Creativity: Rigorous, Directional Creativity.
• This includes designing workflow software into which our creative
talent can insert voice, style, and tone at the appropriate points.
• Data can’t be a substitute for applied creativity.
• The most creatively demanding tasks: coming up with theories,
hypotheses, extracting meaning from data.
12. Page 12
#7: You Need Structured
Obsession Over The Small Details
• The little details matter—a lot.
• You must be obsessed with questioning the marketing of every detail.
• Why? Because customers are AMAZING at spotting bullshit.
– If you say you’ll do anything for your customers but then have a 15-page
T&C where they sign away all rights—they’ll realize it.
• Everything—from the smallest vocabulary choices upwards.
• We call this “Embedded Marketing.”
• You need to do this for both high-level and low-level details.
• Since there is so much to obsess about, we need a structure.
• Without a consistent message, the marketing is wasted.
13. Page 13
This is How Obsessive You Need
To Be: High-Level Details
Some High-Level details that we obsess over:
• What core emotion does it appeal to?
• What beliefs about human nature underlie our product?
• How does it make people want to tell their little sister about it?
• How will the product itself get people so excited that they want to come back
TOMORROW? (Because if there’s too much of a delay before a user returns, it
can’t catch on. That’s the Evite problem.)
• How will the cynical, snarky journalists react to this detail? And other non-target
demographics?
• What style, tone, voice, words convey the right implications?
• What is the buy cycle for this product?
• Should we only give the customer hints, leaving something to figure out?
• Above what price point does thought begin?
14. Page 14
This is How Obsessive You Need
To Be: Low-Level Details
Some Low-Level Details that we obsess over:
• Will the contact page have a phone number, email, or web form?
• What city should the address on the contact page be in?
• Should our email addresses take the form of firstname@domain.com
or firstname.lastname@domain.com or … ?
• What should be the final line of this email we need to send out?
• Will the new user sign-up form contain 3 or 4 fields?
• Should the company tweet, or should individual employees? Or both?
Or—gasp—neither?
• What size, color, style should the “Register” button be?
15. Page 15
#8: Inferring False Implications
• Extracting the wrong meaning from data—particularly,
inferring false implications—is common.
• For example: you probably think that SuccessMetric
#1 implies SuccessMetric #2. So if users add products
to the shopping cart, they must want to buy. But
maybe they just prefer your site as a price-comparison engine.
• Avoid the false dilemma of only two options: “Hypothesis
X is false, therefore counter-hypothesis Y must be true.”
16. Page 16
#9: Combine Buy Cycle
Positioning With Metrics
• Testing and metrics can’t be separated from the
potential client’s position in the buy cycle.
• People who are ready to buy will click on different pages,
use different keywords, do everything differently than people
just beginning their research.
• So we must define and judge SuccessMetrics with
reference to their position in the buy cycle.
• Example: we need different email-signup goals for
users just starting to consider buying vs. those near the
point of purchase.
17. Page 17
#10: High-Level Best Practices
When people think of Testing & Optimization Best Practices, most think Low-Level:
“Capitalize The First Letter Of Every Word Yields More Clicks.”
Don’t focus on these until the end. Instead, focus on other types of Best
Practices:
• High-Level Best Practices
• Technological Best Practices
• Secret Sauce Best Practices: You must develop your own (we have!)
Example:
– In our method, we sell with emotional reasons early on and then logical
reasons near the point of purchase.
18. Page 18
#11: Random External Factors
• You probably think, “Today’s data is great—so
today’s tests are successful.”
• But obvious external factors affect everything:
time of day, holidays.
• And many subtle factors that no one tells you
about. Like the weather.
• (When it rains more, people stay home more—
and are thus more likely to click and make us
money.)
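One way to guard against this: segment results by the external factor before declaring a test a win. A toy sketch, with invented rain data, of the kind of breakdown that reveals whether the weather, not the test variant, moved the numbers:

```python
from collections import defaultdict

# Hypothetical daily log: (day, rained, visits, conversions)
days = [
    ("Mon", False, 1000, 30),
    ("Tue", True,  1000, 45),
    ("Wed", False, 1000, 32),
    ("Thu", True,  1000, 47),
]

# Aggregate visits and conversions per value of the external factor.
totals = defaultdict(lambda: [0, 0])   # rained -> [visits, conversions]
for _, rained, visits, conv in days:
    totals[rained][0] += visits
    totals[rained][1] += conv

for rained, (visits, conv) in sorted(totals.items()):
    label = "rainy" if rained else "dry"
    print(f"{label}: {conv / visits:.1%} conversion")
```

In this made-up data the rainy-day conversion rate is half again the dry-day rate, so a test that happened to run during a wet week would look like a winner for the wrong reason.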
19. Page 19
#12: The More High-Level,
The Higher The Priority
• We must make sure we don’t waste time testing small
things when there are still big things unanswered!
• Most likely, whatever you’re now considering testing
is a waste. There’s probably a bigger testing fish to fry.
• We test big things by doing small tests that are
representative of the big things.
• Remember: priorities change.
• We obsessively list every possible question, issue,
factor, marketing risk—then prioritize the list.
20. Page 20
Our Prioritization Criteria
We prioritize everything through the lowest combined score on these two axes:
These are high-level in terms of the Planning, in descending order of priority:
1. Objectives. What do we want to do?
2. Strategy. How will we do it?
3. Tactics. What are the details of the strategy?
4. Logistics. How do we make it all happen?
These are high-level in terms of the Company, in descending order of priority:
1. The Product. First, the product needs to market itself.
2. Sales Collateral. The sites need to excite the demo, too.
3. Outside Marketing. Outside channels (like ads) must draw qualified people to our
Sales Collateral.
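The "lowest combined score on two axes" rule can be made concrete. A sketch under stated assumptions: the candidate questions and the simple sum-of-ranks scoring are invented for illustration; the two axes and their orderings come from the slide.

```python
# The two axes from the slide, each in descending order of priority.
PLANNING = {"objectives": 1, "strategy": 2, "tactics": 3, "logistics": 4}
COMPANY = {"product": 1, "sales collateral": 2, "outside marketing": 3}

# Hypothetical test candidates tagged with where they sit on each axis.
candidates = [
    ("Button color on signup page", "tactics", "sales collateral"),
    ("Does the product market itself?", "objectives", "product"),
    ("Which ad channels to buy", "strategy", "outside marketing"),
]

# Lowest combined score = highest priority.
ranked = sorted(candidates, key=lambda c: PLANNING[c[1]] + COMPANY[c[2]])
for name, plan, comp in ranked:
    print(PLANNING[plan] + COMPANY[comp], name)
```

Note how the scoring pushes the button-color test to the bottom of the queue until the product-level question is answered, which is exactly the point of secret #12.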
21. Page 21
#13: The Process
You need a process to embody all of the previous approaches. We call
our process “Recursive Marketing,” which is:
• A structure for channeled obsession over the marketing implications
of every tiny detail of the project.
• The application of this structure to high-level issues—then driving
down to repeat the same process in increasing levels of detail.
Recursion: Calling a pattern within the pattern itself. Similar to a Fractal.
Recursion: See Recursion.
22. Page 22
Our Recursive Process:
1. Prioritize: Is this component worth obsessing over? The less important it is, the fewer (if
any) tests and the less time spent.
2. Theorize: Create a hypothesis to test based on your beliefs about human nature,
best practices, and your creativity. Define a test & success metrics.
3. Test. Everyone else talks about this part to death so we won’t!
4. Get Data from the Test—both SuccessMetrics and AncillaryData (to find surprises)
5. Update Your Ideas. Based on the data, we must update our hypotheses, theories,
best practices, beliefs about human nature.
6. Go Back To #1 for this component (doing it differently, with your new hypotheses,
ideas, etc) until the data are good
7. Go Back To #1 on another component if there is surprising data.
8. Go Back To #1 on the next level of detail within this component.
9. Go Back To #1 on the next component of the project—for every possible component.
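The recursive loop above can be sketched as code. All the helper functions are stand-ins (in practice these steps are human judgment calls, not functions), and steps 6, 7, and 9 are compressed for brevity; the shape of the recursion is the point.

```python
# Hypothetical stand-ins for the human steps of the process.
def worth_obsessing_over(c):  return c["priority"] <= 2
def theorize(c):              return f"hypothesis about {c['name']}"
def run_test(c, h):           return {"good": True, "surprises": []}
def update_beliefs(h, data):  pass
def subcomponents(c):
    return [{"name": f"{c['name']}/detail-{i}", "priority": c["priority"] + 1}
            for i in range(2)]

def recursive_marketing(component, depth=0, max_depth=2, log=None):
    """The slide's steps 1-8 as a recursive loop over components."""
    log = [] if log is None else log
    if not worth_obsessing_over(component):        # 1. Prioritize
        return log
    h = theorize(component)                        # 2. Theorize
    data = run_test(component, h)                  # 3-4. Test & get data
    update_beliefs(h, data)                        # 5. Update your ideas
    log.append(component["name"])
    # 6. would repeat on this component until the data are good
    #    (omitted here: the stub test always succeeds)
    if depth < max_depth:                          # 8. next level of detail
        for sub in subcomponents(component):
            recursive_marketing(sub, depth + 1, max_depth, log)
    return log

print(recursive_marketing({"name": "homepage", "priority": 1}))
```

Deeper levels of detail carry a lower priority in this sketch, so the recursion bottoms out on its own, mirroring step 1's rule that less important components get less time.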
23. Page 23
How We Do This For Clients
1. Discovery: Free Consultation to understand the project
2. Consulting: Planning, theorizing, hypothesizing—the
first swipe at thinking through all the large and small
marketing issues.
3. Integrated Software Development: working with our
software development partner using:
• Our integrated marketing+software development methodology
• Which embodies these principles + agile programming
• Process designed for iterations
4. Launch—continued iteration and testing
24. Page 24
Conclusions.
• Testing & Metrics alone build companies like Columbia
House Records (the old music direct marketer)—not iTunes.
• Need to know what and when to test and measure—and what
and when not to.
• Marketing is fundamentally intertwined, at a minimum, with
customer development
• At a deeper level, marketing is fundamentally intertwined with
the business development
• Everything is marketing & marketing is everything
• (Spoken like a completely objective marketer!)
• The boildown of our strategy: we don’t edit; we rewrite.