
How Motive Handles Launches

April 2023

Motive is built upon a robust technical foundation. It emphasizes design, speed, and powerful built-in tools to enhance user experience and optimize dealership operations.

Technically, Motive leverages server-side rendering for dynamic content, ensuring a fast-loading and responsive website. Its AI-powered features, integrated with OpenAI, provide natural language search capabilities and an intelligent chat system, simulating human-like conversations about vehicles. Further, Motive offers group-level analytics, inventory sharing for large auto groups, real-time pricing management, and an AI-powered content engine to streamline content creation.

In this technical paper, we will explore how Motive handles SEO transitions for site moves, including the pre-launch and post-launch steps.

This paper is intended for highly technical and capable readers who understand the complexity and importance of SEO transitions.

II. Pre-Launch Steps

Accessing the Google Search Console (GSC) of the current site:

The first step in handling an SEO transition for a site move is to gain access to the GSC of the dealer's current website.

Creating a domain property if necessary:

In some cases, the current website may only have a URL-prefix property for the www version of the site, which means Search Console only captures data from www URLs. Since Motive sites use the root domain (without the www), we need to ensure that data from both the root domain and the www subdomain is captured. To do so, we’ll create a domain property if one does not already exist. A domain property consolidates data from both the root and www versions of the site, providing a more accurate representation of the site's search performance. Unfortunately, creating a new domain property does not carry over historical data, so in these situations we operate with somewhat less visibility than ideal.

Creating a domain property in GSC

Exporting a list of ranking pages:

To ensure that high-performing pages are recreated on the new site, a list of ranking pages is exported from the current Search Console property. This list includes the pages that are currently ranking for specific keywords, along with their corresponding click-through rates (CTRs) and average positions.
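
This export can be pulled from the Search Console UI or scripted against the Search Analytics API. The sketch below uses the googleapis Node client; the property name, date range, and credentials file are placeholders rather than our actual setup.

```ts
import { google } from "googleapis";

const auth = new google.auth.GoogleAuth({
  keyFile: "service-account.json", // placeholder credentials with access to the property
  scopes: ["https://www.googleapis.com/auth/webmasters.readonly"],
});

const searchconsole = google.searchconsole({ version: "v1", auth });

// Pull every ranking page along with its clicks, impressions, CTR, and average position.
async function exportRankingPages(siteUrl: string) {
  const res = await searchconsole.searchanalytics.query({
    siteUrl,
    requestBody: {
      startDate: "2023-01-01", // placeholder date range
      endDate: "2023-03-31",
      dimensions: ["page"],
      rowLimit: 5000,
    },
  });

  // Each row: { keys: ["https://.../page"], clicks, impressions, ctr, position }
  return res.data.rows ?? [];
}

exportRankingPages("sc-domain:example-dealer.com").then((rows) => {
  console.log(`${rows.length} ranking pages exported`);
});
```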

Recreating high-performing pages:

Once the list of high-performing pages has been exported, the next step is to recreate these pages on the new site. It is crucial to replicate these pages letter-for-letter, including the headings, images, alt text, formatting, and URL structure (if possible).

However, not every page is recreated letter-for-letter at launch. The pages that generate roughly 90% of clicks are manually recreated to ensure a 1:1 match; the remaining pages are recreated afterward, since they are less critical to the site's overall performance but can still contribute to its search engine ranking. Pages that generate many impressions are also taken into consideration, as they can generate more clicks if properly optimized on the new site.
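
As a rough illustration of how that cutoff can be computed from the exported list, the sketch below selects the smallest set of pages covering about 90% of total clicks. The RankingPage shape and the threshold value are assumptions for the example.

```ts
interface RankingPage {
  url: string;
  clicks: number;
  impressions: number;
}

// Select the smallest set of pages that accounts for ~90% of total clicks.
// These are recreated by hand; everything else goes to the page generation engine.
function selectManualPages(pages: RankingPage[], coverage = 0.9): RankingPage[] {
  const totalClicks = pages.reduce((sum, p) => sum + p.clicks, 0);
  const sorted = [...pages].sort((a, b) => b.clicks - a.clicks);

  const selected: RankingPage[] = [];
  let covered = 0;
  for (const page of sorted) {
    if (covered >= totalClicks * coverage) break;
    selected.push(page);
    covered += page.clicks;
  }
  return selected;
}
```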

To streamline the process of recreating pages, we built a page generation engine to recreate the remaining 10% of pages. This process saves hundreds of hours internally and allows us to be more detail-oriented. More on that process below.

Using a page generator to import pages into Motive's CMS:

One of Motive's key technical achievements is our highly modular CMS, which enables a high degree of automation during site launches and page generation.

The page generator works by crawling the existing pages on the old site and converting them to markdown. This is made possible by our page analyzer, which extracts the content, images, and formatting from each existing page and converts them into markdown. Once the markdown has been generated, it can be imported into Motive's CMS in just a few clicks, along with the SEO title and description from the old page.
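
Our page analyzer is proprietary, but the core crawl-and-convert step can be sketched with off-the-shelf libraries such as cheerio and turndown. The content selector and ImportedPage shape below are illustrative assumptions, not the analyzer's actual behavior.

```ts
import TurndownService from "turndown";
import * as cheerio from "cheerio";

// Hypothetical shape of a page ready to be imported into the CMS.
interface ImportedPage {
  slug: string;
  seoTitle: string;
  seoDescription: string;
  markdown: string;
}

const turndown = new TurndownService({ headingStyle: "atx" });

// Fetch an existing page, pull its content region, and convert it to markdown
// along with the SEO title and description. Requires Node 18+ for global fetch.
async function convertPage(url: string): Promise<ImportedPage> {
  const html = await (await fetch(url)).text();
  const $ = cheerio.load(html);

  const seoTitle = $("title").text().trim();
  const seoDescription = $('meta[name="description"]').attr("content") ?? "";

  // "main" is an assumed content selector; the real analyzer is more involved.
  const contentHtml = $("main").html() ?? $("body").html() ?? "";
  const markdown = turndown.turndown(contentHtml);

  return { slug: new URL(url).pathname, seoTitle, seoDescription, markdown };
}
```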

Mapping old slugs to the new slugs:

To ensure that the new site maintains the search engine ranking of the old site, it's important to match old URL slugs to new ones as closely as possible. This involves examining the URL structure of the old site and replicating it on the new site. In some cases, the old URL slugs cannot be replicated exactly due to technical limitations: for example, the old URLs may sit under parent folders (e.g., /folder/page) that do not exist on the new site, or they may end with a file extension (e.g., .html) that cannot be emulated. In those cases, the slug is matched to the new site as closely as possible.

Preview of our URL mapping in progress
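
Before falling back to the embedding-based matching described next, a simple normalization pass can catch the straightforward cases. The rules below (dropping file extensions, keeping the last path segment) are a hedged sketch, not our production matching logic.

```ts
// Normalize an old URL path so it can be compared against the flatter slug
// structure on the new site (e.g. "/folder/page.html" -> "page").
function normalizeSlug(path: string): string {
  return (
    path
      .toLowerCase()
      .replace(/\.(html?|php|aspx?)$/, "") // drop file extensions we cannot emulate
      .split("/")
      .filter(Boolean)
      .pop() ?? ""                         // keep only the last path segment
  );
}

// Look for a direct match on the new site before falling back to embeddings.
function directMatch(oldPath: string, newSlugs: string[]): string | undefined {
  const candidate = normalizeSlug(oldPath);
  return newSlugs.find((slug) => normalizeSlug(slug) === candidate);
}
```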

However, matching old URL slugs to new ones is not always a straightforward process, and it can be challenging to ensure that the new site maintains the search engine ranking of the old site. To address this challenge, we use OpenAI embeddings to map old URLs to new ones accurately.

OpenAI embeddings are numerical vector representations of text that capture the relationships between words and phrases. In the context of URL mapping, this means an embedding can represent a page's content in a way that reflects the keywords and phrases most relevant to it. By comparing these representations, we can help ensure that each old URL is mapped to the most relevant new URL on the new site.

The process of mapping old URLs to new ones with OpenAI embeddings begins with the generation of embeddings for each page on the old site. These embeddings capture the semantic content of each page, allowing us to compare the old pages to the new pages and identify the most relevant new URLs.
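
As a minimal sketch of that step, assuming the OpenAI Node SDK and the text-embedding-ada-002 model (both assumptions for the example, not necessarily our production configuration):

```ts
import OpenAI from "openai";

const openai = new OpenAI(); // reads OPENAI_API_KEY from the environment

// Generate one embedding per page, keyed by URL. Pages are assumed to have
// already been converted to plain text or markdown.
async function embedPages(pages: { url: string; text: string }[]) {
  const response = await openai.embeddings.create({
    model: "text-embedding-ada-002", // assumed model choice
    input: pages.map((p) => p.text.slice(0, 8000)), // rough length guard
  });

  return pages.map((page, i) => ({
    url: page.url,
    embedding: response.data[i].embedding,
  }));
}
```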

Once the embeddings have been generated, we match the old URLs to the new URLs. The algorithm takes into account a range of factors, including the semantic content of the old and new pages, the URL structure of the old and new sites, and the overall relevance of the old pages to the new site.
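
The heart of the matching can be illustrated with cosine similarity over the page embeddings. Our algorithm also weighs URL structure and overall relevance as described above, so the following is only a simplified sketch:

```ts
interface EmbeddedPage {
  url: string;
  embedding: number[]; // output of the embedding step above
}

// Standard cosine similarity between two embedding vectors.
function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// For each old URL, pick the new URL whose embedding is most similar.
// Scores are kept so low-confidence matches can be flagged for human review.
function mapUrls(oldPages: EmbeddedPage[], newPages: EmbeddedPage[]) {
  return oldPages.map((oldPage) => {
    let best = { url: "", score: -1 };
    for (const newPage of newPages) {
      const score = cosineSimilarity(oldPage.embedding, newPage.embedding);
      if (score > best.score) best = { url: newPage.url, score };
    }
    return { from: oldPage.url, to: best.url, score: best.score };
  });
}
```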

Despite the accuracy of OpenAI embeddings, there is always the possibility of errors, so a human QAs the result, examining the mapping to confirm that each old URL points to the correct new URL.

Adding 301 redirects with a redirect engine:

After the old URLs have been mapped to the new URLs using OpenAI embeddings, the next step is to add 301 redirects from the old URLs. This step is crucial: it ensures that any traffic directed to an old URL reaches the corresponding page on the new site, which helps maintain the old site's search engine ranking and lets users find the content they are looking for.

To make the process of adding 301 redirects as easy as possible, we have built a redirect engine native to our platform.

How we upload redirects to Motive

To begin the process, we upload a CSV file that contains the old and new slugs for each page. With a few clicks, the redirect engine adds 301 redirects from the old URLs, ensuring that any traffic directed to those URLs is sent to the new URLs on the new site.
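
The redirect engine itself is built into the Motive platform, but the lookup it performs can be sketched roughly as follows; the CSV format, file name, and bare Node server are illustrative assumptions.

```ts
import { readFileSync } from "node:fs";
import http from "node:http";

// Load old-slug -> new-slug pairs from the uploaded CSV
// (assumed format: "old_slug,new_slug" with no header row).
const redirects = new Map<string, string>(
  readFileSync("redirects.csv", "utf8")
    .split("\n")
    .filter(Boolean)
    .map((line) => {
      const [oldSlug, newSlug] = line.split(",").map((s) => s.trim());
      return [oldSlug, newSlug] as [string, string];
    })
);

// If the requested path is a known old slug, respond with a permanent (301)
// redirect to the new slug; otherwise fall through to the site's router.
const server = http.createServer((req, res) => {
  const path = new URL(req.url ?? "/", "https://example-dealer.com").pathname;
  const target = redirects.get(path);

  if (target) {
    res.writeHead(301, { Location: target });
    res.end();
    return;
  }

  res.writeHead(404);
  res.end("Not found");
});

server.listen(3000);
```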

In addition to adding redirects for the main pages of the site, we also add redirects for the vehicle detail pages (VDPs). Motive maps the old VDP URLs to the new VDP URLs using OpenAI embeddings, then adds 301 redirects so that any traffic directed to the old VDP URLs reaches the new VDP URLs on the new site.

Google detects 301 redirects and treats them as page moves, which means that even when a URL changes, the new URL generally ranks similarly to how the old one ranked before the site change. This is why it is critical to ensure that every old URL is redirected to its corresponding new URL on the new site.

Post-Launch Steps

After the new site is launched, there are several important steps that need to be taken to ensure that the site is properly indexed by search engines. The following steps are taken to help expedite the transition process and minimize any negative impact on search engine rankings:

  1. Remove old sitemaps from Search Console: Once the new site is live, we remove any old sitemaps from Search Console.
  2. Add our sitemap: After removing the old sitemaps, we add the Motive sitemap to Search Console (these two steps can be scripted; see the sketch after this list).
  3. Manually submit core pages for indexing to speed up transition: To help expedite the transition process, we manually submit the core pages of the new site for indexing. This includes the home page, the VDPs, and other high-priority pages.
  4. Validate fixes on all outstanding page indexing issues: After the new site has been live for a period of time, we validate any outstanding page indexing issues that occurred during the transition, examining Search Console to identify any pages that are not indexed correctly.
  5. Update all backlinks to use the root domain instead of www: To ensure that all backlinks are properly redirected to the new site, we update all backlinks to use the root domain instead of www. This helps to consolidate any link equity that may have been split between the two domains.
  6. Update key backlinks to target the new page slugs if necessary: In addition to updating all backlinks to use the root domain, we also update key backlinks to target the new page slugs if necessary.
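
The first two steps above can also be scripted against the Search Console API. The sketch below uses the googleapis Node client; the credentials file, property name, and sitemap URLs are placeholders.

```ts
import { google } from "googleapis";

const auth = new google.auth.GoogleAuth({
  keyFile: "service-account.json", // placeholder credentials with Search Console access
  scopes: ["https://www.googleapis.com/auth/webmasters"],
});

const searchconsole = google.searchconsole({ version: "v1", auth });
const siteUrl = "sc-domain:example-dealer.com"; // placeholder domain property

// Remove the old site's sitemaps, then submit the new Motive sitemap.
async function swapSitemaps(oldSitemaps: string[], newSitemap: string) {
  for (const feedpath of oldSitemaps) {
    await searchconsole.sitemaps.delete({ siteUrl, feedpath });
  }
  await searchconsole.sitemaps.submit({ siteUrl, feedpath: newSitemap });
}

swapSitemaps(
  ["https://www.example-dealer.com/sitemap_index.xml"], // old sitemap(s)
  "https://example-dealer.com/sitemap.xml"              // new Motive sitemap
).catch(console.error);
```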

Quality Assurance and Indexing Checks

We use a combination of manual and automated quality assurance checks. One of the automated checks is a bot that clicks through all the pages that are ranking in Search Console to ensure that none of them return a 404.

The bot works by first extracting the list of pages that are ranking in Search Console. It then visits each of these pages, simulating a user click, and monitors the server response for any 404 errors. If a 404 is detected, the bot alerts us.
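
A minimal version of that check might look like the following, assuming the ranking URLs come from the Search Analytics export described earlier and that Node 18+ provides the built-in fetch; the real bot's batching and alerting details are omitted.

```ts
// Visit every ranking URL and flag anything that resolves to a 404.
// "rankingUrls" is assumed to come from the Search Analytics export above.
async function findBrokenPages(rankingUrls: string[]): Promise<string[]> {
  const broken: string[] = [];

  for (const url of rankingUrls) {
    try {
      // Follow redirects so a 301 -> 200 chain counts as healthy.
      const response = await fetch(url, { redirect: "follow" });
      if (response.status === 404) {
        broken.push(url);
      }
    } catch {
      // Network-level failures are also worth flagging for review.
      broken.push(url);
    }
  }

  return broken;
}
```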

However, some URLs include parameters, such as UTM params, which can cause redirect issues. The bot cannot reliably determine whether these URLs are functioning correctly, so we rely on manual testing for them.

An indexing report in GSC

In addition to the automated checks, we also manually request indexing on an HTML sitemap of VDPs. Our SRPs (search results pages) are dynamic, which means crawlers do not see static internal links to the VDPs. By manually requesting indexing on the HTML sitemap, we help ensure that the VDPs are properly indexed by search engines, further minimizing any negative impact on search engine rankings.
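
For illustration, an HTML sitemap of VDPs could be generated from the inventory feed roughly like this; the Vehicle shape and slugs are placeholders, not our actual data model.

```ts
// Hypothetical inventory record; only the fields needed for the sitemap.
interface Vehicle {
  title: string; // e.g. "2021 Honda Accord EX-L"
  slug: string;  // e.g. "/vehicles/2021-honda-accord-ex-l"
}

// Render a static HTML page that links to every VDP, giving crawlers an
// internal path to inventory pages that the dynamic SRPs do not expose.
function renderVdpSitemap(vehicles: Vehicle[]): string {
  const links = vehicles
    .map((v) => `    <li><a href="${v.slug}">${v.title}</a></li>`)
    .join("\n");

  return [
    "<!doctype html>",
    "<html><head><title>Inventory Sitemap</title></head><body>",
    "  <ul>",
    links,
    "  </ul>",
    "</body></html>",
  ].join("\n");
}
```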

Optimizing Crawl Budget for Large Sites

For larger websites, Google may not crawl all pages at the same rate, since it sets the crawl budget based on its own estimate of how many simultaneous crawl requests the site can handle. Sometimes Google underestimates a site's server capacity, especially when the site uses elastic load balancing, as Motive does. When that happens, Google crawls the site less frequently, or not to completion, even though the site has the capacity to handle much higher levels of crawling.

Changing the crawl budget

To ensure that Google can crawl the site effectively, we may manually adjust the crawl rate setting in Google's legacy Webmaster Tools, an older tool that not many people are aware of. After making this change, we have seen a positive impact in the crawl report.

To elaborate on this further, when a search engine crawls a site, it consumes server resources, and therefore, Google's algorithm is designed to optimize the crawl budget to ensure that it crawls the most important pages efficiently.

By setting the crawl budget explicitly, we take control of how much of the site's resources Google's crawler is allowed to use. This is especially important for larger sites, where a higher crawl rate is needed to ensure that all pages are indexed and ranked properly.

Our experience with optimizing crawl budgets for large sites has shown that this process can have a positive impact on the frequency and size of crawls.

Crawl report example from GSC

Results

After completing the site move process, we typically see a small increase in clicks along with a decrease in total impressions. While this often results in a higher click-through rate (CTR), it is important to note that the higher CTR can be misleading, since it may be driven by the drop in impressions rather than the gain in clicks.

We also typically see an improvement in average position, which may seem like a positive result. However, this can be deceiving as well, since average position can improve simply because low-ranking impressions drop off; we optimize for clicks and strive to maintain or improve the number of clicks the site generates.

One of the positive outcomes we often see is an increase in "Impressions of Good URLs," which can significantly impact the ranking quality of many pages on the site. This can help boost the site's search engine ranking and visibility.

However, we also sometimes observe drops in impressions for longer-tail search terms, or terms we didn't anticipate needing to cover, such as searches for older model year specifications. To address this, we plan to implement a keyword graph to help us better isolate where these drops are happening and adjust our strategy accordingly.
