ClassicPress SEO

SEO for ClassicPress is like technical SEO within any CMS: you work with best practice and find the best plugins.

The ClassicPress CMS has WordPress at its core, so it is search-friendly with a few caveats.

Plugin stability and the possibility of SEO options in core are the two paths worth looking at.

What’s Running Under the Hood Here:

  • Yoast – used only for its XML sitemap generation settings. Yoast is not a plugin to run without stripping its bloat (hence SEO De-bloat below), and it is highly unlikely to keep working in CP
  • SEO De-bloat
  • WPBakery Page Builder

What an SEO Plugin or Core CMS Should Have

A global view – it should cater for:

  • Google
  • Bing/Yahoo
  • Baidu
  • Yandex
  • Naver

This list is not endless, but there is a lot more out there than Google, and where possible all of these can be factored in. In the main, bar Yandex and Baidu, they have very similar methodologies.

Encourage Privacy

Anything aligned with Mozilla and DuckDuckGo works and makes sense, and it is still profitable and merchantable. !g bangs via DDG, for example, surface SEO info for free that is normally hidden in the .com Google search.

On a Per-Page Basis It Should Allow

  1. Maintaining URL slugs and connecting well to a permalink setup. WP is a good example of this; it is one of that CMS’s greatest features where SEO is concerned.
  2. Title and description editing. Yoast has a great tool here, and a few folks have built similar ones; Yoast did not invent the idea, and SERP preview tools have been around for a while.
    • Some tools to review are Screaming Frog and its desktop SEO Spider
    • SEMRush and how it presents SEO issues like this in a dashboard view for simple fixes. This could be factored into an SEO dashboard, provided it uses simple onsite logic and does not feed data off to a third-party API. Most title and description checks are based on length: you can use simple character counts, or render the SERP and measure pixels; neither is 100% accurate, so effort spent building complex checkers has minimal workflow impact. Ultimately this is a manual check the user makes against the search engine (a character-count sketch follows this list).
    • The CMS can do a lot here, but it cannot replace manual checks and paid third-party tools. It can, however, offer a very good set of best-practice guidelines via simple-to-use tools in the back-end.
    • This amounts to 3-4 text input fields.
  3. Meta keywords are still niche in Russia, I believe, and Yandex may still use them. It is not a tag to consider in a new SEO tool, but anything legacy that supports it should keep it, if only for nostalgia.
  4. I think an open-ended header insert works here too and allows a lot of flexibility; the following all sit under SEO and are just simple meta tags (see the second sketch after this list):
    • Webmaster tools verification
    • Schema.org or any search-enhancement code like site search, etc.
  5. XML sitemap generation and a link between a page and the sitemap. If you want to de-index a page or post this is the core need; it has meta tag implications too, but it is arguable whether noindex should now be factored into a new SEO plugin or into the CMS’s own settings.
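
As a minimal sketch of the length-check idea in point 2 – assuming rough limits of 60 characters for a title and 160 for a description (the real cut-off is pixel-based, so these numbers are guidance only) – in PHP, since that is what ClassicPress runs on. All the cp_seo_* names in these sketches are made up:

    <?php
    // Rough length check for a title and description. The 60/160 limits
    // are assumptions: Google truncates by pixel width, not characters,
    // so treat the result as guidance, not a guarantee.
    function cp_seo_length_warnings( $title, $description ) {
        $warnings = array();
        if ( mb_strlen( $title ) > 60 ) {
            $warnings[] = 'Title may truncate in the SERP.';
        }
        if ( mb_strlen( $description ) > 160 ) {
            $warnings[] = 'Description may truncate in the SERP.';
        }
        return $warnings; // An empty array means both fields look fine.
    }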
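
And a sketch of the per-page output itself: a wp_head callback printing the description, a verification tag and the noindex toggle from points 2, 4 and 5. The _cp_seo_* meta keys and the cp_seo_google_verification option are hypothetical names, not a real plugin’s schema:

    <?php
    // Print the per-page SEO fields in the <head>. The meta keys and
    // option below are hypothetical; a real plugin would register them
    // through a meta box and a settings page.
    add_action( 'wp_head', function () {
        if ( ! is_singular() ) {
            return;
        }
        $post_id      = get_queried_object_id();
        $description  = get_post_meta( $post_id, '_cp_seo_description', true );
        $verification = get_option( 'cp_seo_google_verification' ); // Site-wide value.

        if ( $description ) {
            echo '<meta name="description" content="' . esc_attr( $description ) . '">' . "\n";
        }
        if ( $verification ) {
            echo '<meta name="google-site-verification" content="' . esc_attr( $verification ) . '">' . "\n";
        }
        // The per-page de-index toggle from point 5.
        if ( get_post_meta( $post_id, '_cp_seo_noindex', true ) ) {
            echo '<meta name="robots" content="noindex,follow">' . "\n";
        }
    } );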

Site-wide Settings or Holistic SEO Needs

  1. Taxonomy index control. This needs to be a high priority: many CMSs generate sections of pages, archives, search results, pagination, categories, filters, tags and umpteen other things that are not unique content. Only pages and posts should index, plus products where WooCommerce is running. No tag, category or author pages should even load on 99% of small-business brochure websites (see the first sketch after this list).
  2. Robots.txt generation could be a simple once-off that creates a better version of the default, and it could be optional too. It is a simple text file, but it is the first port of call for every ethical search spider (see the second sketch after this list).
  3. Overall site Schema.org – this is a technical thing, but the current plugins try to sugar-coat it. The code is not hard to add and carries little risk: if you add bad Schema nothing will break on the site, and it is invisible to most users. Most folks could use a simple text field with some advice. I always aim this at a person who can write basic HTML; catering technical SEO tools to non-technical folks makes little sense (see the third sketch after this list).
  4. Favicons now appear in SERPs, so they are technically something to consider for SEO; the image can improve click-through in searches, and a result will certainly look bare without it.
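
A minimal sketch of the index control in point 1, hard-coding the rule stated above (only singular pages, posts and WooCommerce products index); a real plugin would expose each taxonomy as a setting rather than bake the rule in:

    <?php
    // Noindex everything except the front page and singular pages,
    // posts and products. The early priority prints the tag near the
    // top of <head>.
    add_action( 'wp_head', function () {
        $indexable = is_front_page() || is_singular( array( 'page', 'post', 'product' ) );
        if ( ! $indexable ) {
            echo '<meta name="robots" content="noindex,follow">' . "\n";
        }
    }, 1 );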
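
For point 2, WordPress – and therefore ClassicPress – already serves a virtual robots.txt and exposes a robots_txt filter, so a better default is only a few lines. The sitemap filename assumes the Yoast-style sitemap_index.xml used later in this article:

    <?php
    // Extend the virtual robots.txt served when no physical file exists.
    add_filter( 'robots_txt', function ( $output, $is_public ) {
        if ( $is_public ) {
            $output .= "Disallow: /?s=\n"; // Keep internal search results out.
            $output .= 'Sitemap: ' . home_url( '/sitemap_index.xml' ) . "\n";
        }
        return $output;
    }, 10, 2 );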
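
And for point 3, printing site-wide Schema really is a couple of lines; here is a sketch building a basic Organization block with wp_json_encode (the cp_seo_org_logo option is a made-up name):

    <?php
    // Print a basic Organization JSON-LD block in the <head>. The name
    // and URL come from core settings; the logo option is hypothetical.
    add_action( 'wp_head', function () {
        $schema = array(
            '@context' => 'https://schema.org',
            '@type'    => 'Organization',
            'name'     => get_bloginfo( 'name' ),
            'url'      => home_url( '/' ),
        );
        $logo = get_option( 'cp_seo_org_logo' );
        if ( $logo ) {
            $schema['logo'] = esc_url( $logo );
        }
        echo '<script type="application/ld+json">' . wp_json_encode( $schema ) . '</script>' . "\n";
    } );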

End Results

With SEO setup, or technical SEO generally, it is always better to look at the rendered site on a live domain; this is the real test. The following are all present on this site, running ClassicPress, and amount to 100% of what a site needs to be technically correct and as optimized as it can be.

  • Un-truncated titles and descriptions
  • Indexed in Google with correct HTML: the copy from the CMS translates to the SERP correctly, with € signs and & entities correctly encoded in the HTML
  • A sitemap_index.xml with only the taxonomies you want indexed; a site: search should show 100% of what you want indexed, and only that
  • A robots.txt file blocking or instructing crawlers, and linking to the XML sitemap mentioned in the previous point
  • Schema.org & Social share markup
    • Site search linked to /?s= – a single line of header code (a sketch follows this list)
    • Social media linkage and replication of the titles and descriptions via og: meta tags for Facebook and the Twitter card code; this is a simple transform of the same info sent to the title and description
  • Other Schema and other meta, as it emerges, seems to be focused outside the header; in a way, SEO will be wrapped into the markup we use over time. Some themes are doing this now, and Schema is very multifaceted, but for the last few years simple code additions are all the major SEO plugins have been doing; there is no magic to this.
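
A sketch of those last two sub-points, reusing the same title and description from the per-page fields; the _cp_seo_description meta key is again hypothetical:

    <?php
    // Social card tags plus the single-line site search JSON-LD.
    add_action( 'wp_head', function () {
        if ( is_singular() ) {
            $title       = get_the_title();
            $description = get_post_meta( get_queried_object_id(), '_cp_seo_description', true );

            echo '<meta property="og:title" content="' . esc_attr( $title ) . '">' . "\n";
            if ( $description ) {
                echo '<meta property="og:description" content="' . esc_attr( $description ) . '">' . "\n";
            }
            echo '<meta name="twitter:card" content="summary">' . "\n";
        }

        // Sitelinks search box: tells search engines the site search lives at /?s=.
        $search = array(
            '@context'        => 'https://schema.org',
            '@type'           => 'WebSite',
            'url'             => home_url( '/' ),
            'potentialAction' => array(
                '@type'       => 'SearchAction',
                'target'      => home_url( '/?s={search_term_string}' ),
                'query-input' => 'required name=search_term_string',
            ),
        );
        echo '<script type="application/ld+json">' . wp_json_encode( $search ) . '</script>' . "\n";
    } );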

This Site’s Index

Currently, as you can see from the screenshot, the site indexes well. I have done very little title tag optimization, but out of the box ClassicPress plus a few common plugins gets good results.

XML Sitemap Index & Robots.txt

It’s a simple file but important for SEO. Equally, Google uses it less and less. You also need to factor a ping into any SEO tool, or practise a weekly check of GSC; advising users to do this manually means less automation and fewer patterns. Manual work seems to be seen as white hat, whereas the more automated any submission to a search engine becomes, the closer to black hat it sits in SEO terms. Bought backlinks versus natural organic links is the prime example.
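
For reference, the ping itself has historically been a single GET request; a sketch using the WP HTTP API (search engines change or retire these endpoints, so treat it as illustrative, not definitive):

    <?php
    // Ping Google with the sitemap URL whenever a post is published.
    add_action( 'publish_post', function () {
        $sitemap = rawurlencode( home_url( '/sitemap_index.xml' ) );
        wp_remote_get( 'https://www.google.com/ping?sitemap=' . $sitemap );
    } );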

Site Health

SEMRush is one of the better tools, though none are perfect. Always review code and learn how to review it. HTML is simple markup; there may be a lot of it in a page, but you can pick out the SEO-specific portions with a little searching in most text editors or a browser’s code view. The 0-100% rating SEMRush gives is glib but in the main pretty accurate.

At the time of this scan I had not started any server-side compression, so most of the errors relate to one small cPanel change.

Rankings Minute 0

Keyword Rankings

SEMRush again here, and a very decent day-one rank; this will grow like any fast, well-tagged site with good content.