VUE JS GOOGLE SEO

Matthew Carter
Hello friends, my name is Matthew Carter. I’m a professional link builder for a large SEO agency in New York City.

TRY OUR PACKAGES

Small Package
$49
Human Articles with Readable Spin
TIER 1
15 High DA Web 2.0 Properties
10 High DA Trusted Profiles
20 High DA Bookmarks
5 EDU Profiles
50 Powerful Web 2.0 Profiles
10 Blog Platform Articles
10 High DA Documents Links
10 Manual Image Submission Links
10 Niche Related Blog Comments
1 Weebly Post
1 Tumblr Post
1 WordPress Post
1 Blogspot Post
1 Medium Post
50 Facebook Reshares
50 Twitter Retweets
100 Pinterest Repins
TIER 2
Blog Comments LinkJuice
Bookmarks LinkJuice
Instant Link Indexer Services
Drip Feed Pinging
Order Now
Medium Package
$99
80 Unique Human Articles no spin
TIER 1
30 High DA Web 2.0 Properties
25 High DA Trusted Profiles
30 High DA Bookmarks
7 EDU Profiles
70 Web 2.0 Media Profiles
25 Blog Platform Articles
15 High DA Documents Links
12 Image Sharing Backlinks
20 Niche Related Blog Comments
10 High DA Forum Profiles
10 Press Releases
Video Creation
10 Video Submissions
PowerPoint Creation
10 PowerPoint Submissions
1 EDU Blog Post
1 Weebly Post
5 High PA Tumblr Posts
1 WordPress Post
1 Blogspot Post
1 Medium Post
1 Mix.com Share
1 Flickr Share
1 Myspace Share
100 Facebook Reshares
100 Twitter Retweets
250 Pinterest Repins
TIER 2
Blog Comments LinkJuice
Bookmarks LinkJuice
Article Submission
Guestbook Comments
Social Network Profiles
Static Links
Referrer Links
Instant Link Indexer Services
Drip Feed Pinging
Order Now
Big Package
$159
140 Unique Human Articles no spin
TIER 1
50 High DA Web 2.0 Properties
40 High DA Trusted Profiles
40 High DA Bookmarks
10 EDU Profiles
100 Web 2.0 Media Profiles
50 Blog Platform Articles
20 High DA Documents Links
15 Image Sharing Backlinks
30 Niche Related Blog Comments
20 High DA Forum Profiles
20 Press Releases
Video Creation
20 Video Submissions
PowerPoint Creation
20 PowerPoint Submissions
1 EDU Blog Post
1 Weebly Post
10 High PA Tumblr Posts
1 WordPress Post
1 Blogspot Post
1 Medium Post
1 Mix.com Share
1 Flickr Share
1 Myspace Share
1 Penzu Post
1 Ex.co Post
1 Behance Post
1 Voog Post
1 LinkedIn Post
1 EzineArticle Post
250 Facebook Reshares
300 Twitter Retweets
500 Pinterest Repins
TIER 2
Blog Comments LinkJuice
Bookmarks LinkJuice
Article Submission
Guestbook Comments
Social Network Profiles
Static Links
Referrer Links
Instant Link Indexer Services
Drip Feed Pinging
Order Now

PORTFOLIO

If you aren't concerned about the other search engines (Yahoo, Bing, etc.), then you can probably get by without having to do anything. Though even with Google, not everyone has seen perfect results.

Here's an example of a Vue image component that takes a Prismic image as a prop. In this example, the image has a default width of 1200px, with responsive views called "medium" and "small" set to 800px and 400px respectively.
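
A minimal sketch of such a component might look like the following. The component and prop names are assumptions; the "medium" and "small" keys match the responsive views described above, and the alt text is read straight from the Prismic image field.

    <!-- ResponsiveImage.vue (sketch) -->
    <template>
      <picture>
        <!-- "small" and "medium" are the responsive views defined in Prismic -->
        <source :srcset="image.small.url" media="(max-width: 400px)" />
        <source :srcset="image.medium.url" media="(max-width: 800px)" />
        <!-- Default view (1200px wide) with dynamic alt text from the Prismic editor -->
        <img :src="image.url" :alt="image.alt" />
      </picture>
    </template>

    <script>
    export default {
      name: 'ResponsiveImage',
      props: {
        // A Prismic Image field, e.g. document.data.illustration
        image: {
          type: Object,
          required: true
        }
      }
    }
    </script>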

Vue is a fast, modern web framework, but it comes with concerns regarding search engine optimization (SEO). On this page, you'll learn a little about SEO strategies with Vue and Prismic.

The concern.

In Prismic, use the Key Text field to add title and description fields to your Custom Type. We recommend putting these fields in a dedicated "SEO" tab.

You can add other meta tags as needed.

When you template your image in Vue, you can use those responsive views to create a responsive Image component.

When you add an Image field to your Custom Type, you can define multiple responsive views for different screen sizes.
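
As a rough sketch, the Custom Type JSON for such an Image field could look something like this. The field name "illustration" is hypothetical; the widths mirror the 1200px default and the "medium" and "small" views used in the image component above.

    {
      "Main": {
        "illustration": {
          "type": "Image",
          "config": {
            "constraint": { "width": 1200, "height": null },
            "thumbnails": [
              { "name": "medium", "width": 800, "height": null },
              { "name": "small", "width": 400, "height": null }
            ],
            "label": "Illustration"
          }
        }
      }
    }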

In your Vue app, use vue-head to inject your title and description into your page header:
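
A minimal sketch of the head option in a page component follows; the title and description strings stand in for the values you would pull from your Prismic document.

    export default {
      head: {
        title: {
          // In practice this comes from the meta title field in Prismic
          inner: 'My Page Title'
        },
        meta: [
          // Likewise, the content would come from the Prismic meta description field
          { name: 'description', content: 'A short description of this page.', id: 'desc' }
        ]
      }
    }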

With basic Vue, most page content is rendered after the page initially loads, in the user's browser. A traditional search engine crawler will only see the initial page load — without any content — and assume the page is basically blank, which is bad for SEO.

Responsive images.

Many developers are using server-side rendering (SSR) and static site generation (SSG) to optimize their websites for search engines. With Vue there are many ways to implement SSR and SSG. We recommend Nuxt.js.
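
For instance, a Nuxt page can fetch its Prismic document on the server, so the HTML sent to the crawler already contains the content. A sketch, assuming Nuxt 2 with the @nuxtjs/prismic module and a hypothetical "homepage" Custom Type:

    // pages/index.vue (script section)
    export default {
      async asyncData({ $prismic, error }) {
        try {
          // Runs on the server during SSR/SSG, so the content ships in the initial HTML
          const document = await $prismic.api.getSingle('homepage')
          return { document }
        } catch (e) {
          error({ statusCode: 404, message: 'Page not found' })
        }
      }
    }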

You can add meta tags to your website header with the vue-head plugin.

Install the plugin:
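
Assuming npm, the install command would be:

    npm install vue-head --save

Then register it once in your app's entry file:

    // main.js
    import Vue from 'vue'
    import VueHead from 'vue-head'

    Vue.use(VueHead)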

"Times have changed. Today, as long as you're not blocking Googlebot from crawling your JavaScript or CSS files, we are generally able to render and understand your web pages like modern browsers."

Understanding SEO with Vue.

One way to optimize your Vue app for SEO is by optimizing your images. Prismic can help make your images responsive and facilitate alt text.

The main SEO issue that comes with Vue.js has to do with search engine crawlers.

One of the most important SEO considerations is your meta tags. Meta tags go inside the <head> of every page on your website and describe the page's content.
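
For example, the rendered head of a page might contain something like this (the values are purely illustrative):

    <head>
      <title>Vue.js SEO Guide</title>
      <meta name="description" content="How to make a Vue.js site crawlable and indexable.">
      <!-- Open Graph, Twitter Card, and other meta tags as needed -->
    </head>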

Notice how the image element also includes dynamic alt text from Prismic. An alt text input is included on all images in the Prismic editor. Alt text helps with SEO and accessibility.

This seems to be the trend with the other major search engines as well.

It’s important to know this, as it could determine whether or not the site’s content is indexable by a search engine, and just as importantly, how well its content is ranked.

The quintessential Vue.js SPA example is the Vue HackerNews Clone 2.0. This is an open source project provided by the Vue team to demonstrate the full capabilities of Vue and effective design patterns.

Don’t forget the “O” in “SEO” stands for “optimization”. Being indexed by a search engine is not enough; we want our sites to rank well, too. Fetch As Google tells us how a page is seen, but not how the page compares to the competition.

Googlebot.

If you can’t use SSR for one of the above reasons, there is another way: prerendering. With this approach, you run the app with a headless browser in your development environment, snapshot the page output, and substitute your HTML files with this snapshot in the server’s response.
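
One common way to do this in a webpack-based Vue project (not mentioned above, but following exactly this approach) is prerender-spa-plugin, which snapshots your routes with headless Chrome at build time. A sketch:

    // vue.config.js (sketch)
    const path = require('path')
    const PrerenderSPAPlugin = require('prerender-spa-plugin')

    module.exports = {
      configureWebpack: {
        plugins: [
          new PrerenderSPAPlugin({
            // Where the prerendered HTML snapshots are written
            staticDir: path.join(__dirname, 'dist'),
            // Routes to snapshot with headless Chrome at build time
            routes: ['/', '/about', '/contact']
          })
        ]
      }
    }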

Just like a browser, Googlebot has a JavaScript engine that implements a particular subset of web standards and ES features, with its own idiosyncrasies in how those are implemented.

My advice: if you are going to use the SPA architecture, be sure to provide server rendered or prerendered pages.

“What happens when you use React without server-side rendering is that the crawler halts on the very first page because it can’t see any hyperlinks to follow. It makes the crawl process incredibly slow and inefficient. And that is why websites built on React (and similar JavaScript platforms) perform worse in Google than websites that primarily serve plain HTML to the crawler. Plain HTML websites can be crawled very efficiently, newly added and changed content will be crawled and indexed much quicker, and Google is much better able to evaluate the crawl priority of individual pages on such websites.”

I deployed this app to a Heroku instance and ran it through Fetch As Google. In the image below, the screenshot on the left shows how Googlebot saw it, and the screenshot on the right shows how a user would see it. They appear to be identical.

Many developers saw Google’s 2014 announcement about JavaScript rendering as an end to SEO concerns about SPA content. In fact, there’s no guarantee that Googlebot will correctly render a page, and if it does, it still may rank the page lower than static HTML pages in competing sites.

Removing server-side rendering.

According to this video from Google developers Addy Osmani and Rob Dodson (released Nov 2017), Googlebot is currently based on Chrome 41. There are a lot of new APIs that have been released since version 41, and if you used any of them, presumably Googlebot would not be able to render and index your page.

The state of the virtual DOM is converted to an HTML string, then injected into the page before it is sent to the client. When the page reaches the browser, the JavaScript app will seamlessly mount over the existing content.
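
Leaving frameworks like Nuxt aside, a bare-bones version of this with vue-server-renderer and Express might look like the following sketch.

    // server.js (sketch of Vue 2 SSR)
    const Vue = require('vue')
    const express = require('express')
    const { createRenderer } = require('vue-server-renderer')

    const renderer = createRenderer()
    const server = express()

    server.get('*', async (req, res) => {
      // In a real app this would be your full Vue app, created fresh per request
      const app = new Vue({
        data: { url: req.url },
        template: '<div>Visited URL: {{ url }}</div>'
      })

      // The virtual DOM is rendered to an HTML string on the server...
      const html = await renderer.renderToString(app)

      // ...and injected into the page before it is sent to the client,
      // where the client-side app mounts over the existing markup
      res.send(`<!DOCTYPE html><html><head><title>SSR</title></head><body>${html}</body></html>`)
    })

    server.listen(8080)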

Even with client rendering only, Googlebot had no trouble seeing the content. I also waited a few days to see if Google had indexed the app. It had.

The bottom line is, if SEO is critical, you can’t rely on your SPA’s client rendering and must ensure content comes included in your pages.

Optimization.

SSR guarantees your page will be crawler friendly, as the page content is complete regardless of how, or even if, the crawler runs JavaScript.

This doesn’t mean you need to abandon the SPA architecture, though. There are two techniques, server-side rendering and prerendering, that can both achieve the desired outcome.

I’m Anthony Gore and I’m a web developer with a crush on Vue.js. I’m a Vue Community Partner, curator of the weekly Vue.js Developers Newsletter, and the creator of Vue.js Developers.

While this architecture works for human users viewing the page in a browser, what about search engine crawlers? Can crawlers run JavaScript? If so, will they wait for the AJAX calls to complete before crawling the page?

Server-side rendering (SSR) is where a page is rendered by the web server as part of the server request/response cycle. In the case of Vue.js and other similar frameworks, this is done by executing the app against a virtual DOM.
