Search Engine Optimization: Proven Strategies for Nonprofits in 2019

According to a recent survey by Zogby Interactive, the Internet is by far the most popular source of information and the preferred choice for news, ahead of television, newspapers, and radio; a majority of Americans now consider it their primary and most reliable source of news. Within the online world, respondents preferred online publications over social media sites such as Facebook and Twitter.

If your mission is to influence opinions, the web cannot be ignored. You must achieve an effective online presence to be a part of conversations that matter. Two sources dominate how people discover content online today: social media and search engines, specifically Facebook and Google.

Publishers need to understand search engine trends to stay relevant

Google and Facebook dominate referral traffic to nearly all news sites today. As dynamic tech companies, Google and Facebook are constantly tweaking their algorithms, so online publishers need to stay informed.

Because the exact algorithms used by search engines and social media sites are secret and ever-changing, a mythology has arisen around the field of Search Engine Optimization or SEO. There are numerous online debates between “white hat” and “black hat” SEO “experts” who recommend and criticize technical tricks to improve search rankings. Because search engines and social feeds have the power to entirely kill most online businesses and publishers, an adversarial attitude dominates the thinking about SEO strategies.

Content is still king in online publishing

The reality is that content is still king. Creating interesting and relevant content that people want to watch or read is still by far the most important factor in the success of a website. The goal of search engines and social networks is not to destroy independent publishers or ruin businesses, but to provide value to their users by showing them the most relevant, reputable, and highest-quality content. Content is key, but it needs to be organized and presented in a way that makes it easy for search engines to find it, understand what it is about, and assess its quality.

Google and Facebook want to be your partner, not your adversary

It’s critical to understand that Google and Facebook want to tell you exactly how to be successful on their platforms. Their secrecy about the details of their algorithms comes mainly from the need to deter malicious actors who attempt to get more traffic than the quality of their content merits. Google and Bing will tell you exactly how they see your website and suggest how to improve your search visibility. They want to work with publishers to promote higher-quality content and rewarding experiences for users.

As a publisher, Google’s Search Console and Bing’s Webmaster Tools are your most valuable assets for improving search performance. Furthermore, on-site content changes (“onsite SEO”) usually have far more impact on search performance than link-building and other external tweaks (“offsite SEO”).

Consider using Google’s native tools before hiring an SEO consultant

You know your own content best and can directly edit it (unlike a paid SEO consultant). So if improving search traffic is your goal, consider starting with Google’s free tools and educational materials first. Content is king, so an investment in the SEO expertise of your editorial/content/communications/marketing team can be far more valuable and long-lasting than a one-time SEO audit.

Content strategy is more important than technical tricks

There are many lists of “SEO tips” out there, but it is more important to understand the essential factors that influence search performance. When a visitor types a search into Google, asks their AI assistant a question, or just unlocks their phone, Google looks at the universe of content in its index and asks two questions to decide what to serve up first:

1) How relevant is your content?

Google’s goal is to see content as a human being would, both visually (in terms of how content is laid out on a page) and contextually (grokking what the page is about and parsing the most subtle differences between similar content to understand what is most relevant).

Google’s code has gotten so good at this that, as an editor, you don’t need to worry about targeting specific keywords but can target concepts instead. For example, you don’t need to have two pages on “traffic jams” and “highway congestion”: Google knows they refer to the same thing, and to the extent that they are different, it will figure out which keyword is more relevant for a specific search. There is no need to stuff your content with keywords.

What you do need to do is accurately describe what your content is about, specifically in your page title and description meta field. The three fields shown to readers in search results are the page title, meta description, and URL, and they need to accurately and uniquely summarize the page content. Your page title, summary, and URL should contain all the information Google and a search user need to decide how relevant your article is.

You should use semantic headers (like h1, h2, and h3) and structured data to provide additional context for classifying content, such as the location of an event or the author of a recipe.
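To make this concrete, here is a minimal sketch of the head markup these recommendations translate into: a title, a meta description, a canonical URL, and a JSON-LD structured data block. The interface and function below are hypothetical illustrations, not FEE.org’s actual templates.

```typescript
// Hypothetical sketch: the head metadata a search engine reads for relevance.
interface Article {
  title: string;          // shown as the headline in search results
  description: string;    // shown as the snippet in search results
  url: string;            // the canonical address of this content
  author: string;
  datePublished: string;  // ISO 8601 date, e.g. "2019-05-01"
}

function renderHead(article: Article): string {
  // JSON-LD structured data gives search engines explicit context
  // (content type, author, publication date) beyond the visible text.
  const structuredData = {
    "@context": "https://schema.org",
    "@type": "Article",
    headline: article.title,
    description: article.description,
    author: { "@type": "Person", name: article.author },
    datePublished: article.datePublished,
    mainEntityOfPage: article.url,
  };

  return [
    `<title>${article.title}</title>`,
    `<meta name="description" content="${article.description}">`,
    `<link rel="canonical" href="${article.url}">`,
    `<script type="application/ld+json">${JSON.stringify(structuredData)}</script>`,
  ].join("\n");
}
```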

2) How authoritative is your content?

Besides the topic of the content, the second major ranking factor is its authority. Google scores every web page in its index with PageRank: an algorithm that measures importance by the number of incoming links and the importance of the websites those links come from. Both domains and individual pages have an authority score. To an extent, the authority of a domain is distributed among its pages by their relative importance, and so is the authority of each link on a given page: the more links there are on a web page, the less authority each one carries.
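To make the link-dilution idea concrete, here is a toy sketch of a PageRank-style calculation. It is a simplified illustration of the published algorithm, not Google’s actual ranking code: each page’s score is divided among its outgoing links, and the process repeats until the scores settle.

```typescript
// Toy illustration of the PageRank intuition: a page's authority is split evenly
// among its outgoing links, so each additional link dilutes the value passed on.
type LinkGraph = Record<string, string[]>; // page -> pages it links to

function pageRank(graph: LinkGraph, iterations = 20, damping = 0.85): Record<string, number> {
  const pages = Object.keys(graph);
  let rank: Record<string, number> = {};
  pages.forEach((p) => (rank[p] = 1 / pages.length));

  for (let i = 0; i < iterations; i++) {
    const next: Record<string, number> = {};
    pages.forEach((p) => (next[p] = (1 - damping) / pages.length));
    for (const page of pages) {
      const outgoing = graph[page];
      for (const target of outgoing) {
        // Authority flows along each link, divided by the number of links on the page.
        next[target] += (damping * rank[page]) / outgoing.length;
      }
    }
    rank = next;
  }
  return rank;
}

// A single link from a well-linked page carries substantial weight.
console.log(pageRank({
  "major-publication": ["your-article"],
  "your-article": ["related-post"],
  "related-post": ["your-article"],
}));
```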

To signal that a page’s content is authoritative, develop partnerships with other authoritative sites that link to it, and build internal links to important content on your own site. A single link from a major publication like the Wall Street Journal, CNN, or a prominent research organization can carry huge weight, though subsequent links from the same source are far less valuable.

Google uses both the content of a page and its outgoing links to measure quality. Content should be written by policy experts and cite reputable sources, such as research organizations and well-established consensus sources. Expertly written and well-cited content is also the best way to earn organic links.

Content clusters: a proven strategy for building authority and relevancy

Users, search engines, and publishers form an ecosystem that evolves in response to each other. The two most important changes over the last decade have been the shift to mobile devices (most searches are now on mobile) and the shift from single-keyword searches to directly asking natural-language questions; over a third of queries are now longer than four words. For example, whereas in 2006 I might have searched for “weed killer,” in 2019 people search for “How do I kill weeds in my yard?” and search engines will also surface content on related terms like “lawn care” and “healthy grass.”

Publishers have adapted to this shift by creating content on relatively broad topics and organizing that content in a way that tells search engines they cover many specific aspects of the broader topic, and that searchers will find their content relevant.

This specific search engine optimization strategy is known as pillar pages and content clusters. The way it works is that a publisher creates content for a relatively broad topic, say “running shoes,” beginning by publishing several articles that each cover a specific aspect of the broader topic, such as “cross country running shoes,” “track and field running shoes,” and the like. These articles are known as “cluster content.” The publisher then creates condensed versions of the cluster content and compiles them into a single article that is optimized for the broad topic (in this case, “running shoes”). This article is known as a “pillar page.” We’ve written a separate article with more details on how we implemented the pillar page strategy on FEE.org. Visits to FEE.org pillar pages now comprise the majority of our organic and paid search traffic.

How we optimized FEE.org for SEO: 2015-2019

Working with editors to write better metadata

When we built the current FEE.org content management system in 2015, supporting rich metadata for social media and search engines was an important part of the design:

Incorporate the document type and title in the URL

For example https://fee.org/articles/without-strong-property-rights-free-trade-wont-help-south-africa/

or https://fee.org/about/annual-reports or https://fee.org/books

Both visitors and search engines can see what a page is about from the URL alone. In addition, each piece of content has a unique canonical tag, even if it is used in multiple places.
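A minimal sketch of that URL convention, with hypothetical helper names: the address is assembled from the document type and a slug derived from the title, so the URL alone describes the content.

```typescript
// Hypothetical sketch of the URL convention: /<document type>/<slug from title>/
function slugify(title: string): string {
  return title
    .toLowerCase()
    .replace(/[^a-z0-9\s-]/g, "") // drop punctuation such as commas and apostrophes
    .trim()
    .replace(/\s+/g, "-");        // spaces become hyphens
}

function canonicalUrl(docType: string, title: string): string {
  // docType is the content section, e.g. "articles" or "books"
  return `https://fee.org/${docType}/${slugify(title)}/`;
}

// -> https://fee.org/articles/without-strong-property-rights-free-trade-wont-help-south-africa/
console.log(canonicalUrl("articles", "Without Strong Property Rights, Free Trade Won't Help South Africa"));
```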

First-class social and search engine metadata editing

When editors enter an article in the CMS, they must first enter complete metadata that tells Facebook and Google how to display the content in link share previews and search results:
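Concretely, that editor-entered metadata ends up as Open Graph and Twitter Card tags in the page head, which is what Facebook and Twitter read when building link share previews. A hedged sketch (the rendering function is illustrative, not our actual CMS code):

```typescript
// Illustrative sketch of the share-preview metadata that editor input feeds into.
// Open Graph (og:*) tags drive Facebook previews; twitter:* tags drive Twitter cards.
interface ShareMetadata {
  title: string;
  description: string;
  imageUrl: string;  // the preview image the editor selects in the CMS
  url: string;
}

function renderShareTags(meta: ShareMetadata): string {
  return [
    `<meta property="og:type" content="article">`,
    `<meta property="og:title" content="${meta.title}">`,
    `<meta property="og:description" content="${meta.description}">`,
    `<meta property="og:image" content="${meta.imageUrl}">`,
    `<meta property="og:url" content="${meta.url}">`,
    `<meta name="twitter:card" content="summary_large_image">`,
  ].join("\n");
}
```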

Use tagging for human and machine learning

In addition to metadata, each document has a set of “public” and “private” tags. Public tags are shown to users, whereas private tags only render in the page’s metadata. The tags are used by our content management system to show related articles, and by our publishing analytics platform (Parse.ly) to show analytics to editors.

Modern search engines do not recognize tags directly, but they do see tags as hyperlinks, which helps them relate articles about similar subjects.
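A sketch of how the public/private split might look in practice; the data shape and the analytics meta tag name below are assumptions for illustration, not our production code.

```typescript
// Hypothetical sketch of the public/private tag split.
interface TaggedDocument {
  publicTags: string[];   // rendered as visible tag links on the page
  privateTags: string[];  // emitted only in page metadata for analytics and classification
}

function renderTags(doc: TaggedDocument): string {
  // Public tags become hyperlinks to tag pages, which search engines can follow.
  const visible = doc.publicTags
    .map((t) => `<a href="/tags/${encodeURIComponent(t)}">${t}</a>`)
    .join(" ");

  // All tags are exposed to the analytics platform via metadata;
  // the exact meta tag name expected by Parse.ly is an assumption here.
  const allTags = [...doc.publicTags, ...doc.privateTags].join(",");
  const meta = `<meta name="parsely-tags" content="${allTags}">`;

  return `${meta}\n<div class="tags">${visible}</div>`;
}
```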

Parse.ly tag analytics:

Showing content for a specific tag, as well as related tags:

Pillar pages on FEE.org

As explained above, we build content clusters that center around pillar pages. For example, if you search Google for “What Caused the Great Depression?” you will see this FEE.org article on the first page of results: https://fee.org/articles/what-caused-the-great-depression/

This is a pillar-page (aka cluster page) template. It’s similar to an article, but it uses a wider cover image and a table of contents to account for its longer length. A pillar page should be a comprehensive guide to a particular topic.

Using tags to dynamically link related content

How can we attract readers to authoritative pages? FEE.org has over 60,000 pieces of content written over 75 years, so it was not possible to edit a substantial number of them to create content clusters. Instead, we used the existing tags to auto-link articles to each other.

First, we defined 300+ tags with a “definitive” link for each tag:

This allows a link to be automatically added whenever one of these keywords is used in an article tagged with that keyword:

We chose to do this manually so that we could use our SEO skills and familiarity with the content to choose the best and most authoritative article, book, or video for each tag.
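A simplified sketch of the auto-linking step, with illustrative names: given the curated map of tag keyword to definitive URL, the first mention of each of the article’s tags is wrapped in a link.

```typescript
// Simplified sketch of tag auto-linking; not the actual FEE.org implementation.
// definitiveLinks is the manually curated map of tag keyword -> best URL for that topic.
function autoLink(
  body: string,
  articleTags: string[],
  definitiveLinks: Record<string, string>
): string {
  let result = body;
  for (const tag of articleTags) {
    const url = definitiveLinks[tag];
    if (!url) continue;
    // Link only the first occurrence of the keyword, case-insensitively.
    const pattern = new RegExp(`\\b(${tag})\\b`, "i");
    result = result.replace(pattern, `<a href="${url}">$1</a>`);
  }
  return result;
}

// Usage: an article tagged "minimum wage" (a hypothetical tag) gets its first
// mention of that phrase linked to the definitive article chosen for the tag.
const linked = autoLink(
  "Raising the minimum wage has unintended consequences...",
  ["minimum wage"],
  { "minimum wage": "https://fee.org/articles/example-definitive-article/" }
);
```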

Auto-Linking Content

Furthermore, any link to another article or resource on FEE.org generates a link preview when visitors hover over it:

This encourages editors to create additional internal site links.
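Roughly how such a hover preview can be wired up in the browser; this is a sketch with illustrative selectors and markup, not our exact front-end code.

```typescript
// Sketch: on hover over an internal link, fetch the target page and show its
// title and meta description in a small tooltip. Selectors are illustrative.
document.querySelectorAll<HTMLAnchorElement>('a[href^="https://fee.org/"]').forEach((link) => {
  link.addEventListener("mouseenter", async () => {
    if (link.querySelector(".link-preview")) return; // already showing a preview
    const response = await fetch(link.href);
    const html = await response.text();
    const doc = new DOMParser().parseFromString(html, "text/html");
    const title = doc.querySelector("title")?.textContent ?? link.href;
    const description =
      doc.querySelector('meta[name="description"]')?.getAttribute("content") ?? "";

    const tooltip = document.createElement("div");
    tooltip.className = "link-preview";
    tooltip.textContent = description ? `${title}: ${description}` : title;
    link.appendChild(tooltip);
  });
  link.addEventListener("mouseleave", () => {
    link.querySelector(".link-preview")?.remove();
  });
});
```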

Using machine learning

Under each article on FEE.org, a “further reading” section uses both the full text of the article and the previous browsing behavior of individual visitors to suggest related content. We do this by submitting article metadata to Parse.ly and then querying its machine learning API.
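A hedged sketch of that query; the endpoint and parameter names below are assumptions based on Parse.ly’s public recommendations API, not a verified copy of our integration.

```typescript
// Sketch: ask a recommendation API for "further reading" suggestions for an article.
// Endpoint and parameters are assumptions; consult the Parse.ly API documentation.
async function relatedArticles(articleUrl: string, apiKey: string): Promise<string[]> {
  const endpoint = new URL("https://api.parsely.com/v2/related");
  endpoint.searchParams.set("apikey", apiKey);
  endpoint.searchParams.set("url", articleUrl);
  endpoint.searchParams.set("limit", "5");

  const response = await fetch(endpoint.toString());
  const payload = await response.json();
  // Return the URLs of the recommended pieces to render under the article.
  return payload.data.map((item: { url: string }) => item.url);
}
```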

Here are two more ways we use APIs to retain visitors:

The “Latest Stories” bar appears up top, and the “Infinite Scroll” of recent stories appears below. The “Latest Stories” bar only shows when the user scrolls up. This is meant to capture “exit intent” and provide additional content when the visitor is about to leave the page. The “Infinite Scroll” is intended to provide a catalog of further reading when the visitor has finished reading the current article.
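Roughly how these two behaviors can be implemented in the browser; the element IDs, class names, and fetch endpoint below are illustrative assumptions, not our production code.

```typescript
// Sketch of the "exit intent" bar: scrolling up reveals the Latest Stories bar.
let lastScrollY = window.scrollY;
window.addEventListener("scroll", () => {
  const scrollingUp = window.scrollY < lastScrollY;
  lastScrollY = window.scrollY;
  document.getElementById("latest-stories")?.classList.toggle("visible", scrollingUp);
});

// Sketch of infinite scroll: when a sentinel near the bottom of the page becomes
// visible, fetch the next batch of recent stories and insert it above the sentinel.
const sentinel = document.getElementById("infinite-scroll-sentinel");
if (sentinel) {
  new IntersectionObserver(async (entries) => {
    if (entries[0].isIntersecting) {
      const offset = document.querySelectorAll(".story").length;
      const response = await fetch(`/api/recent-stories?offset=${offset}`); // hypothetical endpoint
      sentinel.insertAdjacentHTML("beforebegin", await response.text());
    }
  }).observe(sentinel);
}
```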

Together, these changes drove a dramatic decline in the on-site bounce rate.


Summary

  • It’s important for publishers to be aware of best practices for search engine optimization
  • Writing great original content for humans is more important than technical wizardry
  • Great metadata descriptions are key for helping readers discover great content
  • Clusters of related and interlinked documents are far more powerful than a single story
  • It is important to both link to and be linked by respected authorities on the topics you cover
  • Use technology to improve the user experience, but don’t try to “hack” Google’s algorithms with Google-specific tricks
  • Content and experiences that work best for real people are usually the best strategies for search engines as well.
