The Tim Ferriss way of testing ideas and how I did it.

  • There is an ocean of difference between 14% of visitors committing their email address to something, and that same figure actually pulling out their credit card.

    A more prudent assumption would be 10% of that 14% will actually pay for your product when it is launched.

    I learned this firsthand. Your mileage may vary of course, but it's incredibly easy to drive traffic and stimulate interest, enough that people will give you their email address or whatever other personal information you want. The moment you ask them to pay, though, you're in another world in terms of conversions :) This MVP-type Google ad testing doesn't really translate that well into real-world data like purchase conversions. I don't think that's the point anyway.

    It's good for seeing what people might be interested in, which is often something that startups aren't sure of :)
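The discounting described above (14% opt in, then assume only 10% of those actually pay) can be sketched as a quick back-of-the-envelope calculation. The 14% and 10% figures are the commenter's; the visitor count below is a made-up example:

```python
# Back-of-the-envelope: estimate eventual paying customers from
# ad-driven landing-page traffic. The 14% email opt-in rate and the
# "assume 10% of those will pay" discount come from the comment above;
# the visitor count is a hypothetical example, not real data.

def estimate_paying_customers(visitors, signup_rate=0.14, pay_rate=0.10):
    """Return (email signups, estimated eventual payers)."""
    signups = visitors * signup_rate
    payers = signups * pay_rate
    return signups, payers

signups, payers = estimate_paying_customers(1000)
print(f"{signups:.0f} signups, ~{payers:.0f} eventual payers")
# → 140 signups, ~14 eventual payers (1.4% of raw visitors)
```

The point is just that a 14% email-capture rate implies something closer to a 1.4% purchase rate, an order of magnitude less.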

  • That seems hard to believe. I did the same testing strategy with my Job applier software (http://fastjobapplier.weebly.com/) and first of all I only got around 30 ad clicks total, and no one filled out the form. (I was thoroughly disappointed.)

    Granted maybe I just suck at sales copy, or perhaps there just isn't demand out there for software that helps you apply for jobs.

    Still, though, a 10%+ conversion rate seems too high.

    Do you find you're still getting a similar conversion rate now that you're actually selling the product?

  • I've been itching to try this for some time. I would appreciate any war stories HNers could share.

  • Here's another (cheaper, although possibly less realistic) approach to this: I recently did some market research using Amazon's Mechanical Turk, and added a lead collection form as a bonus.

    I put up a simple survey asking users about relevant background information and their experience with the problem my application is trying to solve. Included in this survey was a field to collect the user's email address. This field was very clearly marked as optional (it even appeared after the confirmation code that allowed turkers to complete the "HIT") and the label included something like "Your answers to this survey will help us to evolve the website to better meet your needs. If you'd like to be notified when the new site launches, please enter your email address here." along with an indication that users would receive exactly one email from us due to this form.

    I was pretty pleased with the results of the survey--I received a lot of actionable information very quickly and cheaply (at about $0.10 per response) and I was pleasantly surprised that a little more than 10% of the respondents entered what looks like a valid email address.

    I don't expect many of those 10% to convert (and lead generation wasn't the point of the exercise anyway) but I was very happy with the ROI on this survey.

    Seeing who clicks on ads (and with what copy) and later "soft" converts may be a more realistic test, but the MTurk approach is an order of magnitude less expensive.
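The MTurk economics above (~$0.10 per response, a bit over 10% of respondents leaving an email) work out to roughly $1 per lead. A small sketch, using the comment's figures with a hypothetical response count:

```python
# Rough cost-per-lead math for the Mechanical Turk survey described
# above: ~$0.10 per survey response, ~10% of respondents leaving a
# usable email address. Rates are from the comment; the response
# count is a hypothetical example.

def cost_per_lead(responses, cost_per_response=0.10, email_rate=0.10):
    """Return (total survey spend, effective cost per email lead)."""
    total_cost = responses * cost_per_response
    leads = responses * email_rate
    return total_cost, total_cost / leads

total, per_lead = cost_per_lead(200)
print(f"${total:.2f} total spend, ${per_lead:.2f} per lead")
# → $20.00 total spend, $1.00 per lead
```

Note the cost per lead is independent of how many responses you buy: it's just cost_per_response / email_rate, which is why the approach stays cheap at small scale.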

  • Actually, the figures here may be pessimistic. It takes a certain kind of person to submit their e-mail address when there is quite obviously no deliverable promised for immediate purchase and download. If there were one, conversion might have been higher.

  • I don't get it. So you "sell a product", including a 30 day trial, but once the user signs up there's no product?