Build Progressive Web Apps That Don’t Destroy Your SEO

Traditional web applications render HTML on the server. WordPress, Magento, Ruby on Rails, and most established web frameworks work this way.

Server-side rendering is ideal for SEO: complete pages are delivered to browsers and crawlers alike and, from the client’s perspective, navigation looks much the same as on a static site.

Over the last few years, JavaScript-based web applications that are rendered on the client have become more popular. Originally used for complex applications with rich functionality like Google Docs, developers increasingly turn to JavaScript and frameworks such as React and Vue to build web applications that would once have been rendered on the server.


Progressive Web Apps are the culmination of this trend. A PWA leverages web technologies such as Service Workers, Web App Manifests, and the Cache API to provide a web experience similar to native applications. Progressive Web Apps offer improved perceived performance, seamless page transitions, offline functionality, and home screen installation on mobile devices.
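Of these technologies, the Web App Manifest is the simplest to adopt: a small JSON file that tells the browser how the app should behave when installed to the home screen. A minimal example, with all names and paths as placeholders:

```json
{
  "name": "Example Store",
  "short_name": "Store",
  "start_url": "/",
  "display": "standalone",
  "background_color": "#ffffff",
  "theme_color": "#2196f3",
  "icons": [
    { "src": "/icons/icon-192.png", "sizes": "192x192", "type": "image/png" }
  ]
}
```

The manifest is referenced from each page with a link element such as &lt;link rel="manifest" href="/manifest.json"&gt;.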

These benefits have driven widespread adoption of Progressive Web Applications. Pinterest rebuilt its web application as a PWA and saw engagement increase by 60% and a 40% increase in user-generated ad revenue. Uber’s mobile site is a 50kb PWA that loads in less than three seconds on a 2G network. There are real business advantages to embracing Progressive Web Applications if the SEO is handled intelligently.

The basic PWA loading process looks like this:

  • An HTML app shell is delivered to the browser along with the app’s JavaScript.
  • The app shell and associated script files are cached.
  • The JavaScript code is run, fetching content from the server.
  • The HTML is rendered and displayed in the browser.
  • Page transitions are handled by the JavaScript app, which may also prefetch, lazy-load, and cache content behind the scenes.
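The steps above can be sketched in a few lines. This is an illustration only: renderInto and the fetchContent callback are hypothetical stand-ins for an app’s templating and data-fetching layers, not a real framework API.

```javascript
// The shell ships with an empty mount point; the client fills it in.
function renderInto(shell, content) {
  return shell.replace('<div id="app"></div>',
                       `<div id="app">${content}</div>`);
}

async function loadPage(shell, fetchContent) {
  // Steps 1-2: the shell and scripts have already arrived and been cached.
  // Step 3: the app's JavaScript fetches content from the server.
  const content = await fetchContent('/api/home');
  // Step 4: the HTML is rendered into the shell and displayed.
  return renderInto(shell, content);
}
```

Note that until step 3 completes, a crawler that does not execute JavaScript sees only the empty shell.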

In a real application, this process is more complex: I haven’t mentioned Service Workers here because Google’s crawlers don’t support them, but that won’t usually be an SEO problem for a well-designed web app.

Client-Side JavaScript and SEO

The first thing to understand is that a PWA doesn’t bring any SEO advantage just because it’s a PWA. If your business is happy with the experience its current site offers, then there is no reason to implement a Progressive Web App. The only reason to implement a PWA is to take advantage of the development, business, and user experience benefits it can deliver.


Google Processes JavaScript

Google’s crawlers have been able to execute JavaScript and index JavaScript-generated content for several years. The crawlers follow links in JavaScript-generated content, and PageRank passes through those links. They are also able to process and index metadata such as title tags and meta description tags that are generated by JavaScript.

Google’s crawlers have some limitations where newer JavaScript is concerned, and it’s always wise to test to see whether the specific JavaScript features used in your application are supported – something we address later in this article. If you discover that Google doesn’t support the ES6 or other modern JavaScript features used in your application, consider using a tool like Babel to transpile JavaScript files to more widely supported versions or using server-side rendering to present the crawler with pre-rendered pages.

Google’s crawlers are more sophisticated when it comes to handling complex JavaScript pages than they once were, but there are a few issues that site owners need to keep in mind when rendering content on the client.

Make sure crawlers can access JavaScript and CSS files

Some SEOs consider it a best practice to block Googlebot’s access to JavaScript and CSS files. This is a bad idea when the content is rendered by JavaScript: if the crawler can’t fetch the scripts, it can’t render the page at all.

Google isn’t the only search engine

Google will probably crawl your JavaScript web application correctly. The same is not true of DuckDuckGo, Baidu, Yandex, and other search engines. If these search engines are important to your business, be sure to read the next section of this article, where I’ll discuss server-side rendering of JavaScript applications.

Page navigation can be a problem in some cases

For the most part, page navigation will work perfectly well in JavaScript apps that handle routing on the client. One exception is URLs that include fragments: crawlers generally ignore everything after the #, so fragment-based routes may never be discovered as separate pages. For optimal compatibility, use the History API for client-side routing and avoid # in URLs.
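As an illustration, a hypothetical helper for migrating old fragment-style routes to crawlable path-style URLs might look like this; it operates on paths, not absolute URLs, and the URL shapes are examples rather than a universal rule:

```javascript
// Rewrite a fragment route ('/app#/products/42') into a path route
// ('/app/products/42') that crawlers treat as a distinct page.
// Takes a path, not an absolute URL.
function toCrawlablePath(path) {
  return path
    .replace(/#\/?/, '/')     // drop the fragment marker
    .replace(/\/{2,}/g, '/'); // collapse any doubled slashes
}
```

A route map built with this helper can then be served via 301 redirects so that existing fragment URLs keep working for users.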

Use canonical links to avoid duplicate content issues

It is common for businesses to deploy a Progressive Web App while leaving the original desktop or mobile sites in place. This can cause duplicate content issues if pages are not properly canonicalized. On PWA pages, include a link element with a rel="canonical" attribute that points to the desktop or other preferred version of the page.
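For example, if a PWA page mirrors a desktop page (the domain and path here are placeholders):

```html
<!-- In the <head> of the PWA page -->
<link rel="canonical" href="https://www.example.com/products/blue-widget">
```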

Avoid ES6 code in production

Google’s crawlers are based on the Chrome 41 browser, which was released before ES6. Transpile code that relies on ES6 features like arrow functions, or avoid it altogether.
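With Babel, for example, @babel/preset-env can be pointed at the crawler’s browser version so that arrow functions and other ES6 syntax are compiled away. A minimal .babelrc sketch:

```json
{
  "presets": [
    ["@babel/preset-env", { "targets": "chrome 41" }]
  ]
}
```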


JavaScript and Server-Side Rendering

Client-side code execution and page rendering are responsible for many of the advantages of Progressive Web Applications, but there are a couple of major disadvantages.

First, as I have already mentioned, alternative search engine crawlers don’t have sophisticated JavaScript processing capabilities. Second, the initial render of PWAs can be slower than is ideal because the app shell, JavaScript code, and associated libraries have to be downloaded before content can be downloaded and rendered.

Slow-loading PWAs are a particular problem for mobile users with low-bandwidth connections: once the initial load is finished, a PWA will be faster than a traditional server-side application, but that’s no comfort to users who have to wait tens of seconds before the site is usable.

The solution to both problems is the same: render the initial view on the server. Web applications that can be rendered on both the client and the server are often called isomorphic applications or universal applications. The initial view is rendered on the server and sent to the browser along with the JavaScript modules needed to display it; the rest of the JavaScript code and content is loaded as needed.
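The core of the isomorphic idea can be sketched without any framework: one render function shared by server and client, with the server also embedding the data the client app needs to take over without refetching. Everything below (renderPage, window.__DATA__, /app.js) is illustrative, not a real framework API.

```javascript
// One render function, usable on both server and client.
function renderPage(page) {
  return `<main><h1>${page.title}</h1><p>${page.body}</p></main>`;
}

// What a server handler would send: pre-rendered HTML plus the page
// data, so the client bundle can "hydrate" the page instead of
// fetching and rendering from scratch.
function serverResponse(page) {
  return [
    '<!doctype html><html><body>',
    renderPage(page),
    `<script>window.__DATA__ = ${JSON.stringify(page)}</script>`,
    '<script src="/app.js"></script>',
    '</body></html>',
  ].join('');
}
```

Crawlers and first-time visitors get complete HTML; subsequent navigation runs renderPage in the browser.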

Rendering React or Vue pages on the server is more complex than simply serving up an app shell and some JavaScript files, but with modern tools like Next.js it’s relatively straightforward to create React apps that provide excellent performance and SEO on the initial load alongside all the benefits of a PWA. Next.js is a JavaScript framework that, by default, provides server-side rendering for the initial load, automatic code splitting, and page-based client-side routing.

Server-side rendering isn’t a magic bullet for PWA performance issues: it significantly increases server load, adds complexity to the development process, and can cause performance problems of its own with larger initial HTML downloads and increased latency. Developers should consider the advantages and drawbacks of server-side rendering on a case-by-case basis.


Dynamic Rendering

In a recent development, Google introduced dynamic rendering as an option for JavaScript applications. When a browser requests a page, it is rendered on the client as usual. But when Googlebot crawls the app, it is rendered on the server. This has advantages for Google and web application owners. Google doesn’t have to use its resources to render JavaScript applications, and app owners can be sure that their pages are seen by Googlebot as intended.

Google suggests dynamic rendering for web applications with large amounts of rapidly changing content, and for apps that rely on modern JavaScript features that are only supported in versions of Google Chrome newer than version 41.
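The routing decision at the heart of dynamic rendering is simple: inspect the user agent and serve pre-rendered HTML to known crawlers, and the normal client-side app to everyone else. A minimal sketch follows; the bot list is illustrative, and real deployments typically proxy crawler requests to a headless renderer such as Rendertron.

```javascript
// Illustrative list of crawler user-agent substrings; extend as needed.
const BOT_PATTERN = /googlebot|bingbot|yandexbot|baiduspider|duckduckbot/i;

function wantsPrerendered(userAgent) {
  return BOT_PATTERN.test(userAgent || '');
}

// In a server, this decides which pipeline handles the request, e.g.:
// wantsPrerendered(req.headers['user-agent'])
//   ? serveStaticRender(req)  // hypothetical pre-render path
//   : serveClientApp(req);    // normal app shell + JavaScript
```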

Testing JavaScript Application SEO

The advice in this article will help you implement a JavaScript web application without disadvantaging your site in the search results. However, site owners should always check that JavaScript-generated pages are being crawled and indexed. Googlebot is good at processing JavaScript, but it isn’t perfect.

The Fetch as Google tool is the best way to test how Google sees your web app. Be sure to use the “Fetch and Render” functionality, otherwise JavaScript code won’t be processed.

In Summary

Not too long ago, client-side web applications were an SEO nightmare. Today, businesses can safely embrace Progressive Web Apps and other JavaScript web applications without negatively affecting SEO.

Note: The opinions expressed in this article are the views of the author, and not necessarily the views of Caphyon, its staff, or its partners.

The post Build Progressive Web Apps That Don’t Destroy Your SEO appeared first on AWR.