When testing a new version of your website on staging, it’s possible to encounter technical problems in Google Search. Luckily, you can use your browser and Google Search Console tools to conduct a quick first analysis before bringing out more advanced troubleshooting tools.

Inspecting the HTML and Network Tabs

To start the analysis, we run the staging URL through the Mobile-Friendly Test. If some content is missing, we can inspect the website with Developer Tools open. The Elements tab shows the HTML representation of the page's DOM, which is the closest match in the browser to the rendered HTML shown by the Google Search Console tools.

We can use this to search for content on the page and see whether it is present in the DOM and where it is placed. Additionally, the Network tab shows each request to the server and its response. We can use the waterfall diagram to learn where time is spent, and we can inspect all request and response headers.
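For a closer look at the waterfall, DevTools can export the Network tab's contents as a HAR (HTTP Archive) file. As a sketch, the snippet below ranks requests in a HAR export by their total elapsed time; the inline `har` dictionary is a tiny hypothetical stand-in for a real export, not data from an actual page.

```python
import json  # a real HAR export would be loaded with json.load(open("page.har"))

def slowest_requests(har: dict, top: int = 3):
    """Return (url, total_ms) for the slowest requests in a HAR export.

    HAR stores each request's total elapsed milliseconds under
    log.entries[].time.
    """
    entries = har["log"]["entries"]
    ranked = sorted(entries, key=lambda e: e["time"], reverse=True)
    return [(e["request"]["url"], e["time"]) for e in ranked[:top]]

# Hypothetical, minimal HAR stand-in for demonstration:
har = {"log": {"entries": [
    {"request": {"url": "https://example.com/app.js"}, "time": 420.0},
    {"request": {"url": "https://example.com/"}, "time": 120.0},
    {"request": {"url": "https://example.com/style.css"}, "time": 35.0},
]}}
print(slowest_requests(har, top=2))
```

Running this against a real export quickly surfaces which resources dominate the load time without having to eyeball the waterfall.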

Using the Network Tab to Locate Missing Content

The Network tab also helps us search for missing content to check whether it ever made its way from the server to the browser. If the content does not appear in any response, we can scroll the page a bit to see if it shows up. If it does, the Network panel in DevTools shows which request made the content visible, and from there we can jump to the specific piece of code that made the request.
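The same search can be automated over a HAR export: the sketch below scans each entry's response body for a text snippet and reports which URLs delivered it. The `har` dictionary here is again a small hypothetical stand-in; note that real exports only include bodies when saved "with content".

```python
def requests_containing(har: dict, needle: str):
    """List URLs whose response body contains the given text.

    HAR stores response bodies under log.entries[].response.content.text
    (present only when the export includes content).
    """
    hits = []
    for entry in har["log"]["entries"]:
        body = entry["response"].get("content", {}).get("text", "")
        if needle in body:
            hits.append(entry["request"]["url"])
    return hits

# Hypothetical export: the product name arrives via an API call,
# not in the initial HTML document.
har = {"log": {"entries": [
    {"request": {"url": "https://example.com/"},
     "response": {"content": {"text": "<html><body>Loading...</body></html>"}}},
    {"request": {"url": "https://example.com/api/products"},
     "response": {"content": {"text": '{"name": "Blue Widget"}'}}},
]}}
print(requests_containing(har, "Blue Widget"))
```

If the search comes back empty, the content never reached the browser at all, which points the investigation back at the server.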

In this case, we can see that the code only runs when scrolling happens. As Googlebot does not interact with pages, and that includes scrolling, we need to find a way to load this content without relying on scroll events so that Googlebot can see it too.

Additional Features of the Network Tab

The Network tab also has several other features, including disabling the cache, throttling the network transfer speed or simulating other network conditions, and overriding the user agent used for requests. However, it's important to note that setting the user agent to Googlebot might not work as expected: Googlebot also respects robots.txt, and some sites do IP lookups to check whether a request really comes from a Google data center.
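The robots.txt part of this is easy to check without a browser at all. The sketch below uses Python's standard-library `urllib.robotparser` against a hypothetical staging robots.txt that blocks Googlebot specifically, which would explain why a Googlebot user agent gets different results than your browser does.

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt for a staging site that blocks Googlebot
# but allows everyone else:
robots_txt = """\
User-agent: Googlebot
Disallow: /

User-agent: *
Allow: /
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Googlebot is disallowed everywhere; other crawlers are allowed.
print(parser.can_fetch("Googlebot", "https://staging.example.com/products"))
print(parser.can_fetch("SomeOtherBot", "https://staging.example.com/products"))
```

Note that this only covers the robots.txt side; if a site verifies crawler IPs against Google's data centers, no local user-agent spoofing will reproduce Googlebot's view.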

Recap and Conclusion

In conclusion, your browser has built-in debugging powers: a view of the DOM, inspection of network requests and responses, and settings for the user agent, network conditions, and caching. Used together with the testing tools in Google Search Console, these let you conduct a quick analysis of your website and identify technical problems that need to be addressed.


What is Browser DevTools?

Browser DevTools is a set of web development and debugging tools built into most web browsers. It enables developers to easily troubleshoot issues related to website SEO performance. Through the DevTools, you can inspect HTML and CSS, analyze network performance and resources, view the document object model (DOM), debug JavaScript, and more.

What can Browser DevTools do for website SEO performance?

Browser DevTools gives you a powerful suite of tools to troubleshoot website SEO performance. You can analyze page resources, debug JavaScript, inspect HTML and CSS, view the document object model (DOM), identify and fix server errors, and more.

How do I access DevTools?

DevTools is built into most web browsers including Google Chrome, Firefox, Safari, and Edge. To access DevTools, open your browser and press F12 (on Windows/Linux) or Cmd+Opt+I (on Mac) to open the DevTools window.

Is there any specific DevTools website SEO setting?

No, there is no specific DevTools website SEO setting. Instead, DevTools gives you access to the different components of your website, so you can troubleshoot any issue which might be affecting your website SEO performance.

What are some common sources of website SEO issues?

Common sources of website SEO issues include:

  • Incorrectly implemented meta tags
  • Inaccurate or missing page titles or descriptions
  • Thin, duplicated, or otherwise poorly written page content
  • Incorrect internal and external link structure
  • Incorrectly assembled schemas and structured data
  • Poor server performance
  • Incorrectly optimized images
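A few of these, such as missing titles and meta descriptions, can be checked programmatically with the standard library alone. The sketch below is a minimal audit using Python's `html.parser`; the sample page and the checks it runs are illustrative, not an exhaustive SEO audit.

```python
from html.parser import HTMLParser

class MetaAudit(HTMLParser):
    """Collect the <title> text and meta description from an HTML page."""

    def __init__(self):
        super().__init__()
        self.title = ""
        self.description = None
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "title":
            self._in_title = True
        elif tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title += data

# Hypothetical page with a title but no meta description:
page = ("<html><head><title>Blue Widgets</title></head>"
        "<body>No description here.</body></html>")
audit = MetaAudit()
audit.feed(page)

problems = []
if not audit.title.strip():
    problems.append("missing <title>")
if not audit.description:
    problems.append("missing meta description")
print(problems)  # ['missing meta description']
```

Remember, though, that a check like this only sees the served HTML; content added by JavaScript needs the rendering-aware tools discussed above.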