An SEO guide for the product-leading CTO
Queries and Clicks
Let's get this out of the way first: Google appears to be a feedback-driven algorithm, shaped by user behavior. Everything I propose in this post ultimately comes down to one question: "did you provide the user what they were looking for?" Google is in the business of providing the right answer for the question it is being asked. This Moz article does a great job of summarizing this dynamic through patent filings and casual observation. In the end, I think the paradigm is simple: if you ask Google the same question twice, it learns that its first answer wasn't good enough. Over time, this can affect your search ranking.
All SEO tactics are based on this idea - whether the person doing the SEO can trace it back to this idea or not. SEO professionals draw on years of experience of what works and what doesn't to advise their clients on how to improve their rankings. But it all comes back to queries and clicks - and once you see that, the SEO game becomes a bit more understandable.
SEO is almost entirely a user experience exercise. Here are some examples to show why.
Page Titles, Headers, and Content Structure
The semantics of title tags, H1s, headers, sections - you name it - have long been a source of consternation for people trying to get good SEO. It makes sense: a machine (Googlebot) is scraping our pages, so make them as easy to understand as possible. But there's another way to look at it. Let's say you are running a recipe site and someone asks Google for "roasted beet recipes." If they get your page as a result and click through, a well-designed site will also show the user that the page is about roasted beet recipes, and that you can learn to cook roasted beets by reading it. Whether the text "roasted beet recipe" appears in an H1, or occurs frequently within a section tag, is of no consequence to your visitor. I would gamble that a site using any method to highlight what a page is about is likely to get good search engine traffic - replace your H1 with a div whose CSS makes the font 26pt, and you will achieve the same thing for the user.
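To make that concrete, here's a minimal illustration - both snippets communicate the same thing to a sighted visitor, even though only the first is "semantic" (the heading text is from the example above; the styling values are arbitrary):

```html
<!-- Conventional semantic heading -->
<h1>Roasted Beet Recipes</h1>

<!-- A styled div that looks identical to the visitor -->
<div style="font-size: 26pt; font-weight: bold;">Roasted Beet Recipes</div>
```

I'd still reach for the h1 - it costs nothing and helps screen readers - but the point stands: it's the visitor's comprehension, not the tag name, that ultimately feeds back into ranking.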
The same idea applies to accessibility. The reasoning for ALT attributes has often been that they help Google understand your images. But in reality, they help your users understand images they cannot consume conventionally. If Google serves up a page full of images to someone who cannot see those images - how valuable is that page? That user is likely to seek another result by asking Google the same question, which in turn gives feedback to Google: the user didn't find that result valuable, so let's make sure we give them a better result next time.
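A quick sketch of ALT text that serves the user first (the filename and descriptions are invented for illustration):

```html
<!-- Vague alt text: tells a screen-reader user almost nothing -->
<img src="beets-1.jpg" alt="photo">

<!-- Descriptive alt text: the page is still useful without the image -->
<img src="beets-1.jpg"
     alt="Roasted beets tossed with goat cheese and walnuts on a serving platter">
```

Write the second kind for your users, and whatever signal Google extracts from it comes along for free.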
Lighthouse and Page Speed Insights are invaluable tools provided by Google. They allow webmasters and CTOs to assess very technical issues with their web site for remediation. As with many SEO tactics, I was always curious whether the tools Google provided were indicative of some master algorithm, with sites in an arms race for the perfect score - or whether these tools were a convenience Google provided to help webmasters. I've concluded it's the latter.
If we believe that user experience drives the best search results, then obviously a lousy experience - where a mobile user has to wait 6, 7, even 10 seconds to consume your content - is probably going to lead to a high bounce rate. Your content can be fantastic, but if the pain the user endures to get there is intolerable, they may still go back and ask Google for another result. Just the other day I asked Google for information about a shell scripting solution I've never committed to memory. The site that typically has the right answer was down, so I went to another site. Google isn't measuring site reliability - it's measuring how reliable you are to your users.
The tools that Google provides to measure page speed are a convenience, part of their ethos of making the web a better place. There is no definitive "score" that changes your search engine ranking for better or worse - but if your users get fed up with your popup-ridden, slow site and go back to Google for other answers, that will impact your ranking over the long haul.
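As a sketch of the kind of low-effort fixes these tools tend to surface - the `defer` and `loading="lazy"` attributes are standard HTML, though the file names here are hypothetical:

```html
<!-- Defer non-critical JavaScript so it doesn't block the first render -->
<script src="analytics.js" defer></script>

<!-- Let the browser lazy-load below-the-fold images; explicit
     dimensions also prevent layout shift while the page loads -->
<img src="hero-photo.jpg"
     alt="Roasted chicken resting on a cutting board"
     loading="lazy" width="800" height="600">
```

None of this is done for an algorithm's sake - it's done so the mobile user on a slow connection doesn't bounce back to the results page.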
Sitemaps and Link Structures
Content discoverability is common sense: more pageviews, longer time on site, lower bounce rate. You've just created a much better web site for your users. Obviously, if you want Google to crawl and index your content, you want internal linking that works. You want redirects that don't hold up your users. You want URLs that aren't opaque. Particularly for the more savvy users, knowing where they are and where they are going is a matter of trust. It's one of the reasons Google displays the result URL with each result, and why the idea of domain authority exists.
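As an illustration of the "URLs that aren't opaque" point (both paths are invented):

```html
<!-- Opaque: the user can't tell where this goes before clicking -->
<a href="/index.php?id=8842&amp;cat=7">Click here</a>

<!-- Transparent: the link text and the URL both say where you're going -->
<a href="/recipes/roasted-beets">Roasted beet recipes</a>
```

The second link builds trust with the savvy user and, incidentally, gives a crawler a readable path - the user benefit and the SEO benefit are the same thing.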
But honestly, I'm not sure that so-called "HTML sitemaps" - pages filled with links, actually achieve anything. If you have content that is orphaned and not discoverable by users, then what value does it have? If you started tomorrow and took the mindset that every page is a landing page, wouldn't you design your site differently? You would have deep cross-linking, semantic linking to relevant content, etc. But directory pages that simply walk users 2 or 3 clicks to the page they are looking for may not be applicable. It seems likely that if a user asked Google for "clinical trials albany new york" that yes, you should have a landing page that gives that user a way to discover many trials in Albany. But if that is just a page of links, instead of a search-faceting experience where within one click the user can find what they are looking for - that's probably a negative user experience.
Domain authority is the idea that your domain name (e.g. ryannorris.com) is something Google understands and considers legit, and therefore ranks you on the basis that your domain matters. Your domain does matter - just not the way an SEO might tell you it does.
Google doesn't care about the difference between ryannorris.info or ryannorris.biz or ryannorris.name - it cares about what searchers care about. As I said earlier, Google displays the full URL with each result on the SERP for a reason: it is allowing user behavior to give it feedback on what matters. URLs read left to right, so the first thing a searcher on Google sees is the domain. Let's imagine a search result with this page path: /blog/cooking/roasted-chicken-with-root-vegetables.html
Now let's host that page on two domains (invented for illustration): https://grandmas-kitchen.com/blog/cooking/roasted-chicken-with-root-vegetables.html and https://xrt-holdings.biz/blog/cooking/roasted-chicken-with-root-vegetables.html
Which one are YOU more likely to click? To our eyes, the page path is basically irrelevant in a search for "roasted chicken," because the first things you see are the protocol and the domain. Hell - Google's mandate on SSL-enabled web sites (HTTPS vs HTTP) seems just as likely to be the result of Google having data showing more clickthroughs to sites whose displayed URL uses HTTPS.
It's a somewhat thorny issue. If you have registered a new domain and launched a new site, your "domain authority" is more likely the result of other marketing tactics that build the domain's brand than anything else. If sweatyguy.com has run television ads that build awareness of sweatyguy.com, it likely has better domain authority because Google users have heard of the domain.
The Exceptions: Structured Data, Manual Actions, and "Questionable" Content
I'm still not sure what to make of structured data. Whether you use microdata, RDFa, or JSON-LD - and regardless of what Google says - I've seen nothing to suggest that priority is given based on the format of your structured data. My best guess is that you should use structured data because when you cross whatever thresholds Google has (some combination of clickthrough and bounce rate), structured data is your gateway to rich results, answers, and the knowledge panel. But there's no reason to believe it impacts user experience directly - users have no widespread way to see your structured data unless Google surfaces it in its own UX. You should always include structured data when you can, because the impact of rich results on CTR is enormous, but nothing in the queries-and-clicks philosophy suggests it improves your initial SEO.
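For what it's worth, JSON-LD is the format Google's own documentation recommends. A minimal sketch for a recipe page, using the schema.org Recipe type (all values are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Roasted Chicken with Root Vegetables",
  "author": { "@type": "Person", "name": "Your Name" },
  "description": "A one-pan roasted chicken dinner.",
  "recipeIngredient": ["1 whole chicken", "2 lbs root vegetables"]
}
</script>
```

The user never sees this block - it exists purely so Google can build a rich result around your page once the page has earned it.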
Manual actions and site security issues may be places where Google is indeed on the offensive. Google gains nothing by directing a user to a ransomware site or a site laced with malware links. Here the editorial function of Google overrides the idea of queries and clicks - Google has content standards, and it doesn't matter how many people are searching for content that Google believes doesn't meet those standards. It will be blocked.
Should I Hire an SEO?
If you have been operating a site and haven't been getting great organic results, the return is well worth the cost. Regardless of how they reason about it, successful SEOs have the experience to apply what works and generate results quickly. If you aren't a native SEO, that experience is worth its weight in gold.
If you're just going live, however, the points above - the idea that SEO is a function of user experience - can start you out on the right foot when used as a guiding principle in site design. If the user gets the answer to the question they asked Google - quickly, accurately, and without pain - you will likely see positive results.
Ryan is the former Chief Product Officer at Medullan, CTO at Be the Partner, and CTO and General Manager at Vitals. He currently works as a fractional CTO offering strategy as a service to growth-stage companies in health care and education.