Heidi Thorne is an author and business speaker specializing in sales and marketing topics for coaches, consultants, and solopreneurs.
This year, I’m doing another run of my biennial self-publishing survey. The goal of the survey is to determine what self-published authors are currently achieving financially, as well as what motivates and challenges them in publishing their own books.
Along the way, I’ve gained some great insight. Here are some of the challenges I've encountered, and tips I've discovered, while doing research with an online survey.
The Purpose of Your Survey
As with all goals, figuring out the why for your survey is the first thing that needs to be done. Your business’ online survey could attempt to get feedback, opinions, and information on a number of items, including:
- A new product, service, or project you’re considering.
- Customers’ reactions to new logo redesign ideas, changes in service, etc.
- Triggers that encourage customers to buy (or not).
- Demographic data (age, gender, income, etc.) to help define who your market really is... but be careful with privacy and go anonymous!
- Insight into what’s happening in your market or industry that could impact your business or future.
Your Burning Question
A burning question for you that doesn’t seem to have a widely known answer can be a great starting point for creating your online survey.
For my burning question, I knew what other self-published authors and writers in my personal network were achieving and struggling with, and I wanted to see if their experience was common across the industry.
While there were some reports tracking aggregate, industry-wide stats for self-publishing and traditional publishing, few actually asked authors themselves how much they make. Some created estimates extrapolated from industry-wide stats. One was biased, seeking survey data to show the need for book marketing (which, of course, the surveying company offered).
That was what drove me to do this independent study. I always feel that the best way to find out what is happening is to ask people.
Choosing an Online Survey Platform
Currently, I use the SurveyMonkey online survey platform. While there are other platforms to consider, I chose this one because of its reputation for being easy to use for both marketers and survey takers. As well, it has controls for protecting data and an anonymous response option (which I absolutely needed).
I also like that I have the option to create tracking links to determine where participants are coming from. For example, for this year’s study, I set up links for three sources: Facebook, Instagram and a catchall link for every other source that would not be a big driver of traffic. Knowing the biggest sources for the last study (the number one driver was Facebook), I was able to concentrate advertising and promotions to optimize costs.
SurveyMonkey also has a free version for smaller surveys and marketers. I also like that I can purchase upgraded features on an as-needed basis, as opposed to having to pay a monthly fee forever.
While this platform was the right choice for me, shop around for features and pricing that fit your budget and purposes. Items to evaluate include:
- Participant data protection.
- Pricing and feature structures that scale upward or downward as needs change.
- Ease of use for both you and your participants.
- Trackable links for each source (social media, blog, etc.) to determine most effective sources of participants.
- Support documentation and methods (email, phone, help articles, etc.).
- Reporting features that help you analyze your data.
One big caution: To preserve your participants’ privacy and more easily track your data, don’t build a survey on your website or with unsecured online documents/wikis. Use an established online survey platform, even if it costs you a few bucks!
Promoting and Advertising the Survey Invitation
Getting your survey invitations to potential participants can be difficult unless you have a massive and responsive email list. I don’t have that, and most of my following is via social media. So most of my survey invitations were done via social media through both free and paid (boosted and/or sponsored) advertising.
I also collaborated with a few relevant thought leaders in the self-publishing and small business space, asking them to share the survey promotion on their Facebook or other social profiles. Some were not interested in sharing. I completely understood since they could have seen it as a conflict with their own efforts.
I even did direct outreach through private messages to selected authors I’m connected with on social media. Most did not respond to my message, and there was no bump in surveys taken for the days I did this outreach. They may have been fearful of doing the survey (even though I assured them it was anonymous). Or they were put off by the fact that I had the audacity to connect with them via private message. Or they were just too busy.
Plus, I had the issue of no incentive working against me. There was no reason for them to participate other than to help the industry or me.
Survey Links to Determine Promotion Source
On the SurveyMonkey platform for free surveys, you are currently allowed up to three shareable links. These links track the sources of survey participants so that you can determine where most of your survey traffic is coming from.
For example, I set up three links for my 2016 survey: Facebook, Twitter and everything else (email newsletters, LinkedIn, etc.). Facebook was far and away the biggest driver. Twitter only generated about 10 percent of participants versus 50 percent for Facebook. So when I did the 2018 survey, I put a lot more investment in Facebook promotions, both free and paid, even though I’m not personally that active on the platform.
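That share-of-traffic math is simple enough to sketch. The counts below are hypothetical placeholders standing in for your tracking-link totals; swap in your own numbers:

```python
# Hypothetical participant counts from three tracking links.
participants = {"Facebook": 43, "Twitter": 9, "Everything else": 34}

total = sum(participants.values())

# Report each source's share, biggest driver first.
for source, count in sorted(participants.items(), key=lambda kv: -kv[1]):
    print(f"{source}: {count / total:.0%} of participants")
    # e.g. Facebook: 50% of participants
```

Comparing these shares across survey runs tells you where to concentrate your next round of promotion spending.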
Run Advertising Experiments to Save Costs
In terms of hard dollar costs, I could have spent a whole lot more on promotions such as Facebook ads. But at an acquisition cost of up to $5.00 (sometimes more) per survey participant (determined by dividing the total spent on ad promotions by the number of surveys taken during the ad run), I had to be conservative to preserve resources.
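The per-participant calculation just described can be sketched in a couple of lines; the dollar and survey figures here are hypothetical examples, not my actual numbers:

```python
def cost_per_participant(total_ad_spend, surveys_taken):
    """Acquisition cost: total spent on ad promotions divided by
    the number of surveys taken during the ad run."""
    return total_ad_spend / surveys_taken

# Hypothetical: a $150 ad run that produced 30 completed surveys.
print(cost_per_participant(150.00, 30))  # 5.0 dollars per participant
```

Tracking this number per ad run (and per platform) is what tells you when a promotion channel is too expensive to keep funding.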
Your costs could be lower or even higher than mine. Before spending on Facebook ads, or any other paid advertising options, thoroughly understand how to use the ad platform. Also, run some small ad experiments so you can get a feel for how your promotion and market will behave before spending more. Advertising costs can escalate quickly if parameters are not properly set.
For example, I found that most of the participants were responding to the ads from Friday through Monday. Therefore, I concentrated my advertising runs during the weekends.
The Other BIG Survey Costs
Ad expenses don't even take into account the literally hundreds of hours I have spent and will spend on organizing, analyzing, and preparing the findings to share with the world.
In return for collecting, analyzing, and publishing results as a service to the publishing community, I hope to recoup my investment through increased podcast and web traffic, and maybe some more email subscribers. The survey itself is not a moneymaker, even though I will be selling the full report and analysis on Amazon as an eBook for those who want it. Otherwise, all the results will be shared with the industry for free through my podcast, video channel, and other content as appropriate.
Set a budget for your survey and understand how it will benefit your business, both in terms of hard dollars and marketing opportunities.
Getting People to Participate and the Impact of Incentives
Because I was going to be asking some questions about money and wanted to preserve participants’ anonymity, I did not use the invitation to the survey as an invitation to join my email list or receive an incentive. I felt they would be concerned about providing this information if they thought I could link their email address to their response (even though that would have been difficult to do!). So it was impossible for me to offer an incentive that would be delivered privately and directly to participants.
As I was doing keyword research for this article, I was amazed at the number of search queries for making money by taking surveys. This situation raises even more questions. If an incentive (money, free product, etc.) is offered, will the participants be authentic and relevant? Will their answers provide any true insight? Or will it merely tell you how appealing your incentive is? What this highlights in my mind is how difficult it is to get people to participate in survey research.
Side note: Some survey platforms, including SurveyMonkey, have paid features which help you get more survey takers. Whether they are paid or unpaid participants will depend on the platform or service you use. However, it’s likely that some form of incentive is used to recruit these people.
Since I was providing no incentive, participants were those who were either genuinely interested in this research or interested in supporting my work. This no doubt resulted in low participant numbers. It also raises the concern that this would not be a representative sample of the greater self-published author population.
However, I’d rather err on the side of getting genuinely interested parties, regardless of motivation, as opposed to freebie and cash seekers who could seriously skew my findings with bogus information. True, even genuine participants could lie to make themselves feel better. But since their motivation for participating was for more than quick cash, the chances are likely better for getting more accurate self-reports.
Give Yourself and Your Participants Time
In both years’ surveys, I started in the mid-summer and set a deadline for the end of November so that I could analyze and publish results by early January. Even given that the survey was active for about 4 months, it took that long to get even a reasonable sample of people to join in. Allow enough time to get your data.
How Many Survey Participants Do You Need?
The short answer to that question is: the more, the better.
Because of my niche topic, lack of incentive and possible resistance to sharing information, I ended up with 86 author participants in 2016. Not bad considering all the factors, but not what I would have liked. The number of participants in 2018 about doubled (163). Even then, I wish I could have gotten more.
There are survey sample size calculators online (including two on SurveyMonkey). You just input the following pieces of data:
Population Size
This is the total number of people in the group you’re studying, not the number in your sample. Sometimes getting the population size is difficult because it’s just not known.
For my self-publishing survey, self-published authors are not tracked by systems such as the Bureau of Labor Statistics and many authors pursue this as a side hustle. After researching a variety of relevant metrics, I “guesstimated” a population size, eventually plugging in about 2 million. Is that right? I just don’t know. But as I played with the sample size calculators, I noted that anything over 1 million seemed to suggest the same sample sizes as several million. So I didn’t investigate and estimate further, and targeted the sample size for a total population of 1 million.
Confidence Interval
An easy way to explain this would be to fill in this sentence: “I am ___% confident that the average (income, weight, or whatever measurement) of the population is ___ based on my sample data.” A common target confidence interval is 95%, but confidence intervals generally range from 80% to 99%, with the choice depending on the parameters of the research being done.
Margin of Error
This expands on the foregoing statement with, “I am ___ % confident that the average (income, size, or whatever measurement) of the population is ___ based on my sample data, plus or minus __% (or other exact measurement, such as dollars or pounds).” As you might expect, the lower the margin of error, the better, ideally as close to zero as possible.
What these calculators attempt to provide is a sample size that will have statistical significance for your study. However, as I have found, it may be difficult to get your target sample size. In those cases, you’ll have to deal with what you’re able to collect, but make notations about the limitations of your findings. You can play with these calculators to figure out various confidence intervals and margins of error.
For my recent study, I used the SurveyMonkey calculators. I wanted a confidence interval of 95%, with a 5% margin of error. But with the smaller sample size I achieved, I would only be able to get a margin of error of between 8% and 10%. Not a big deal, since this is not life-and-death research, as it would be for testing medical treatments. But those margin of error percentages would be noted in reporting.
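For readers curious about what those calculators are doing under the hood, here is a rough sketch using the standard sample-size formula for a proportion (at maximum variance, p = 0.5) with a finite population correction. These are textbook values, not necessarily SurveyMonkey's exact method:

```python
import math

# Standard z-scores for common confidence levels.
Z_SCORES = {0.90: 1.645, 0.95: 1.96, 0.99: 2.576}

def sample_size(population, confidence=0.95, margin_of_error=0.05, p=0.5):
    """Required sample size, corrected for a finite population."""
    z = Z_SCORES[confidence]
    n0 = (z ** 2) * p * (1 - p) / margin_of_error ** 2  # infinite-population size
    return math.ceil(n0 / (1 + (n0 - 1) / population))  # finite population correction

def margin_of_error(n, confidence=0.95, p=0.5):
    """Approximate margin of error achieved by a sample of size n."""
    z = Z_SCORES[confidence]
    return z * math.sqrt(p * (1 - p) / n)

print(sample_size(1_000_000))          # 385 participants for 95% / ±5%
print(round(margin_of_error(163), 3))  # 0.077, i.e. roughly ±8%
```

Note how the target sample size barely changes once the population passes a million, which matches what I observed playing with the online calculators.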
Survey Questionnaire Design Issues
There are some best practices for survey questionnaires.
How Long Should Your Survey Be?
According to a stat on the SurveyMonkey site, about 10 questions is ideal for a survey. Plus, that’s the limit for the free survey service level they offer. As well, I think that’s about my personal tolerance limit for a survey. Too many questions can result in people dropping out of the survey before finishing it.
Even with that low number of questions, some of them easy to answer (“What is your gender?”), I found it took participants about 3 minutes to finish the survey.
Avoid Leading Questions: Just the Facts
Especially in online surveys, keep your questions as easy to answer as possible and avoid leading questions. Stick to the facts.
Will You Allow Skipped Questions?
When I first did surveys, I made all responses required in the hopes of getting more completed surveys. However, what happened was that if people didn't want to answer this or that question, they just dropped themselves out of the survey completely.
While, yes, requiring that all questions be answered did help get some complete surveys, I did miss some responses that could have been valuable for some analysis. For example, I included a couple of attitude type questions. Even if the participants didn't answer the rest of the survey, I would have gotten some insight on their attitudes which could stand alone, apart from the other analyses I did.
On the second go-round, I allowed question skipping. I still got a 79 percent completion rate. Not bad.
Don't Jump to Conclusions
We’ve all seen outrageous clickbait “4 out of 5 doctors recommend” type headlines making claims that are too good or too crazy to believe.
You’ll often see in academic research that researchers are slow to make definitive statements about their studies, particularly concluding that a causal relationship exists where it may not. And you should follow suit.
All you can do is report what you found and offer your opinions and interpretations, noting any limitations of your research. Example: In my 2016 survey reports, I said that a certain percentage of surveyed authors made this or that income on their books. Note that it was surveyed authors.
Also, including some info about the survey is recommended to give context and to cite the source. Example: In publishing infographics of my 2016 results, I referenced the data source and where to find more information.
This content is accurate and true to the best of the author’s knowledge and is not meant to substitute for formal and individualized advice from a qualified professional.
© 2019 Heidi Thorne
Heidi Thorne (author) from Chicago Area on January 11, 2019:
Linda, sometimes I wouldn't have thought of these topics either... until I bumped up against the issues while trying to accomplish something else. Thanks so much for your support and hope your New Year is going great so far!
Heidi Thorne (author) from Chicago Area on January 11, 2019:
Liz, Survey Monkey is so popular, and with good reason. It's so easy to use for everyone. Thanks for stopping by. Hope your New Year is off to a great start!
Linda Crampton from British Columbia, Canada on January 07, 2019:
You write about topics and details that I’ve never thought of before, which I appreciate. Thank you for creating another interesting and very useful article, Heidi.
Liz Westwood from UK on January 07, 2019:
Your first hand insights are very useful on surveys. I have come across Survey Monkey many times myself.
Heidi Thorne (author) from Chicago Area on January 04, 2019:
Hi Pamela! Thanks for the kind words. True, there are a lot of details to consider when doing a survey. Happy New Year!
Pamela Oglesby from Sunny Florida on January 02, 2019:
This is a great article to let us know all the possible overlooked criteria to develop a good survey to find the results you are looking for in any given year.
Heidi Thorne (author) from Chicago Area on January 02, 2019:
Bill, you never know when you might want to get a pulse on your audience. Let me know if you ever do one and how it goes.
Thanks for your support of my survey! (I'll be contacting you shortly about the results.) Have a great day!
Heidi Thorne (author) from Chicago Area on January 02, 2019:
Flourish, we've all seen the worst of surveys! Ugh. I think the worst ones are those where they're aiming for a specific answer. Violates all survey research protocol!
I do like SurveyMonkey. It is a very usable, cost-effective and flexible platform.
Thanks for chiming in with your career experience! Have a great day!
Bill Holland from Olympia, WA on January 02, 2019:
Sticking this in the Heidi File; I'm sure I'll use it down the road. Happy New Year, my friend. Have a tremendously successful 2019.
FlourishAnyway from USA on January 01, 2019:
People often vastly underestimate how much work should go into building a good survey. You’ve provided good pointers here. With a background in I/O Psych this is right up my alley. I often try to just ignore online surveys because they are often so badly constructed. SurveyMonkey is a solid recommendation.