Yes – you obviously clicked the link.

In all seriousness, the answer is a little murky. There are some indicators that question headlines might entice a click, but there's no real consensus about the tactic. As with any content strategy, uncovering what works best requires testing and measuring. Then testing and measuring again… and once more. It's a constant cycle.

In this blog post, I sift through the evidence and provide ways to help you determine whether your content needs more questions.

What does the research show?

Content strategists constantly weigh the best practices for writing. This involves assessing the performance of past content and leveraging that information to anticipate what will resonate with readers. At the same time, researchers are taking a more academic approach to determine the efficacy of various tactics.

Researchers at BI Norwegian Business School in Oslo, Norway, ran a study directly comparing declarative headlines against those asking questions. The test ran on Twitter and a Norwegian shopping site that resembles eBay. The study found that question headlines gained an average of 150 percent more clicks. When paired with “self-referencing cues” (think of the word “you”), this average jumped to 175 percent more than declarative headlines.

This is a big difference! But the researchers point out that the study has its limitations. According to a Marketing Land article, the results might not replicate outside the study's shopping context, and there is potential for cultural bias (for example, what works in Europe might not work in Asia).

As with all academic research, more studies will have to be conducted. So, what are we seeing in the field?

How about question headlines on the Voxus blog?

At Voxus, we run a blog covering topics like public relations and marketing, and its traffic ebbs and flows with each post. For this post, I peeked at our past numbers.

In an audit of a sample of Voxus posts, those with question headlines received on average 62 percent more clicks than their declarative counterparts. That said, the audit did not reveal a significant difference in overall post views, so the finding doesn't suggest we should simply increase our volume of question headlines. Other variables more likely influenced these posts' readership – whether employees shared them on social media, the topics covered, publishing time and more.

The audit serves as a potential trend indicator, but it doesn't clearly settle the impact of question headlines.
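To make the audit math concrete, here's a minimal sketch in Python. The view counts are made up for illustration – not our actual analytics – but the calculation of an average lift between headline styles works the same way with real data.

```python
# A rough sketch of the blog audit: average views for question vs.
# declarative headlines, and the percentage lift between them.
# The numbers below are hypothetical, for illustration only.
question_views = [480, 350, 610, 407]      # posts with question headlines
declarative_views = [260, 310, 240, 330]   # posts with declarative headlines

avg_question = sum(question_views) / len(question_views)
avg_declarative = sum(declarative_views) / len(declarative_views)
lift = (avg_question - avg_declarative) / avg_declarative * 100

print(f"Question avg: {avg_question:.0f}  Declarative avg: {avg_declarative:.0f}")
print(f"Lift: {lift:.0f}%")  # a raw lift alone says nothing about significance
```

A raw lift like this is a starting point, not a verdict – which is exactly why the next question matters.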

But what does this all mean?

Here is the answer to whether question headlines work – maybe.

While the blog audit and academic research point to increased traffic, the effectiveness of question headlines still appears up in the air. Whether you should embrace this tactic really depends on what resonates with your audience.

As such, you should test various content approaches and see how readers respond.

How should I test question headlines?

The website BuzzFeed is a prolific A/B tester. It pulls the levers on content – headlines, lengths and thumbnail images – until it finds the winning combination that draws tens or hundreds of thousands of views. The viral sensations Refinery29 and Upworthy also A/B test their headlines and have seen a 5 to 6 percent increase in clicks as a result.

Given what A/B testing can do, try different tactics and measure the results. You might be surprised by what works. You can even use the Optimizely WordPress plugin to test headlines specifically: it lets you input variations of a headline and uncover which one drives the most clicks and shares.
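If you'd rather roll your own measurement, one common way to compare two headline variants is a two-proportion z-test on their click-through rates. Here's a minimal sketch in Python – the click and view counts are hypothetical, so plug in your own analytics data:

```python
import math

def headline_ab_test(clicks_a, views_a, clicks_b, views_b):
    """Compare click-through rates of two headline variants with a
    two-proportion z-test. Returns both CTRs and the z statistic."""
    ctr_a = clicks_a / views_a
    ctr_b = clicks_b / views_b
    # Pooled proportion under the null hypothesis (no real difference)
    pooled = (clicks_a + clicks_b) / (views_a + views_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / views_a + 1 / views_b))
    z = (ctr_b - ctr_a) / se
    return ctr_a, ctr_b, z

# Variant A: declarative headline, Variant B: question headline (made-up numbers)
ctr_a, ctr_b, z = headline_ab_test(clicks_a=120, views_a=5000,
                                   clicks_b=165, views_b=5000)
print(f"Declarative CTR: {ctr_a:.2%}  Question CTR: {ctr_b:.2%}  z = {z:.2f}")
# Rule of thumb: |z| > 1.96 suggests the difference is unlikely to be
# random noise at the 95 percent confidence level.
```

Whether you use a tool like Optimizely or a homegrown script, the point is the same: let the data, not intuition, decide which headline ships.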

Until you find the best formula, keep asking what writing tactics might work and measure the results.