GOOGLE SEO DYNAMIC CONTENT

Matthew Carter
Hello friends, my name is Matthew Carter. I’m a professional link builder for a large SEO agency in New York City.

TRY OUR PACKAGES

Small Package
$49
Human Articles with Readable Spin
TIER 1
15 High DA Web 2.0 Properties
10 High DA Trusted Profiles
20 High DA Bookmarks
5 EDU Profiles
50 Powerful Web 2.0 Profiles
10 Blog Platform Articles
10 High DA Documents Links
10 Manual Image Submission Links
10 Niche Related Blog Comments
1 Weebly Post
1 Tumblr Post
1 WordPress Post
1 Blogspot Post
1 Medium Post
50 Facebook Reshares
50 Twitter Retweets
100 Pinterest Repins
TIER 2
Blog Comments LinkJuice
Bookmarks LinkJuice
Instant Link Indexer Services
Drip Feed Pinging
Order Now
Medium Package
$99
80 Unique Human Articles (no spin)
TIER 1
30 High DA Web 2.0 Properties
25 High DA Trusted Profiles
30 High DA Bookmarks
7 EDU Profiles
70 Web 2.0 Media Profiles
25 Blog Platform Articles
15 High DA Documents Links
12 Image Sharing Backlinks
20 Niche Related Blog Comments
10 High DA Forum Profiles
10 Press Releases
Video Creation
10 Video Submissions
PowerPoint Creation
10 PowerPoint Submissions
1 EDU Blog Post
1 Weebly Post
5 High PA Tumblr Posts
1 WordPress Post
1 Blogspot Post
1 Medium Post
1 Mix.com Share
1 Flickr Share
1 Myspace Share
100 Facebook Reshares
100 Twitter Retweets
250 Pinterest Repins
TIER 2
Blog Comments LinkJuice
Bookmarks LinkJuice
Article Submission
Guestbook Comments
Social Network Profiles
Static Links
Referrer Links
Instant Link Indexer Services
Drip Feed Pinging
Order Now
Big Package
$159
140 Unique Human Articles (no spin)
TIER 1
50 High DA Web 2.0 Properties
40 High DA Trusted Profiles
40 High DA Bookmarks
10 EDU Profiles
100 Web 2.0 Media Profiles
50 Blog Platform Articles
20 High DA Documents Links
15 Image Sharing Backlinks
30 Niche Related Blog Comments
20 High DA Forum Profiles
20 Press Releases
Video Creation
20 Video Submissions
PowerPoint Creation
20 PowerPoint Submissions
1 EDU Blog Post
1 Weebly Post
10 High PA Tumblr Posts
1 WordPress Post
1 Blogspot Post
1 Medium Post
1 Mix.com Share
1 Flickr Share
1 Myspace Share
1 Penzu Post
1 Ex.co Post
1 Behance Post
1 Voog Post
1 LinkedIn Post
1 EzineArticle Post
250 Facebook Reshares
300 Twitter Retweets
500 Pinterest Repins
TIER 2
Blog Comments LinkJuice
Bookmarks LinkJuice
Article Submission
Guestbook Comments
Social Network Profiles
Static Links
Referrer Links
Instant Link Indexer Services
Drip Feed Pinging
Order Now

PORTFOLIO


Historically, JavaScript-powered websites have not fared well in search: they are user-friendly but not bot-friendly. This is due to the limited crawl budget of the Google Search Bot and the resource-intensive nature of rendering JavaScript content. When search engine crawlers encounter heavy JavaScript content, they often have to index it in multiple waves of crawling. This fractured process can miss elements, like metadata and canonical tags, that are critical for proper indexing.

Does Google approve of dynamic rendering?

Yes! Not only does Google approve of dynamic rendering, they strongly recommend it and even coined the term.

The websites that benefit the most from dynamic rendering are large sites with complex JavaScript and many pages that need to be indexed.

Adoption of Dynamic Rendering

You could have all the content resources in the world, but if Google can’t see that actual content, what good is it doing? So, we see that a lot. Companies have bigger indexation issues than they have any idea about, because it’s kind of hard to know. You see the crawl stats, right? You’re like, “Oh, they’re crawling me. I’m good.” And you see that they’re downloading information, but you don’t really know exactly what they’re downloading and how much they are actually accessing the stuff that you’re working on. With dynamic rendering, all those problems just get eliminated. All the content’s being indexed, and content affects rankings and rankings affect traffic. So you get a pretty significant benefit. If the site is pretty heavy in JavaScript or difficult to crawl, all of a sudden, they’re going to become privy to all this new information in a very short amount of time. And that’s actually going to impact rankings and traffic and all those other good things.

Unsurprisingly, if you give Google what they want, they’ll send you a lot of traffic. Geoff Atkinson, Founder-CEO of Huckabuy.

Google recommends that webmasters incorporate dynamic rendering in at least three instances. First, it is recommended if you have a large site with rapidly changing content that requires quick indexing. Second, it is recommended if your website relies on modern JavaScript functionality. Third, it is recommended if your website relies on social media sharing and chat applications that require access to page content.

Dynamic rendering means that your site will render differently depending on what calls it; users see the normal client-side version of the site while search engine bots see a version designed specifically for them. It’s one of the biggest changes Google has made in the past decade.
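The mechanism described above is usually just a check on the incoming User-Agent. The sketch below is a minimal illustration, not any vendor's actual implementation; the bot list and the HTML strings are placeholder assumptions.

```python
import re

# Illustrative (not exhaustive) list of crawler User-Agent substrings.
BOT_PATTERN = re.compile(
    r"googlebot|bingbot|yandexbot|duckduckbot|baiduspider|twitterbot|facebookexternalhit",
    re.IGNORECASE,
)

def wants_prerendered(user_agent: str) -> bool:
    """Return True when the caller looks like a search or social bot."""
    return bool(BOT_PATTERN.search(user_agent or ""))

def serve_page(user_agent: str) -> str:
    """Bots get flat, prerendered HTML; humans get the client-side app shell."""
    if wants_prerendered(user_agent):
        # Static snapshot: same content, nothing for the bot to execute.
        return "<html><body><h1>Apartments in San Francisco</h1></body></html>"
    # Normal client-side version: an empty shell filled in by the JS bundle.
    return '<html><body><div id="app"></div><script src="/bundle.js"></script></body></html>'
```

In practice this check lives in a middleware or CDN layer so the application code never needs to know which variant was served.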

Google prefers content written in static HTML, but they are also interested in organizing search results that reflect the internet as it is. Dynamic rendering presents an opportunity to access, crawl, and index large websites and dynamic pages that rely on frequently changing, heavy JavaScript content.

So, you can strip that stuff out in a dynamically rendered version. For our Huckabuy Cloud, for example, if we were to take a customer that’s on our product and look at their actual page, the dynamically rendered version of that page looks almost identical, but it’s like 20-40% of the size of the previous page; it’s lighter, it’s faster. It’s flat HTML, it looks very similar, but you are going to see some of the dynamic stuff getting pulled out: chat boxes and things like that.

How does Huckabuy dynamically render web pages?

On the server side, JavaScript content is converted into the static HTML version preferred by search engine bots, allowing them to fully access, crawl, and index webpage content. It’s one of the biggest technical SEO initiatives Google has endorsed in years.
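Because rendering a page in a headless browser is expensive, server-side conversion is typically paired with a short-lived cache so each URL is rendered at most once per interval. The sketch below assumes a `renderer` callback (in production this would drive a headless browser); the class name and TTL are illustrative, not a specific product's API.

```python
import time
from typing import Callable, Dict, Tuple

class PrerenderCache:
    """Cache prerendered HTML snapshots, re-rendering only after the TTL expires."""

    def __init__(self, renderer: Callable[[str], str], ttl_seconds: float = 300.0):
        self._renderer = renderer
        self._ttl = ttl_seconds
        self._store: Dict[str, Tuple[float, str]] = {}  # url -> (timestamp, html)
        self.render_calls = 0  # instrumentation: how many real renders happened

    def get(self, url: str) -> str:
        now = time.monotonic()
        cached = self._store.get(url)
        if cached is not None and now - cached[0] < self._ttl:
            return cached[1]  # snapshot is still fresh: serve without re-rendering
        self.render_calls += 1
        html = self._renderer(url)
        self._store[url] = (now, html)
        return html
```

A short TTL matters for the "rapidly changing content" case Google describes: bots always get a recent snapshot, but the rendering cost stays bounded.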

Dynamic rendering is one of the most important technical SEO initiatives that Google has rolled out in the last decade. Geoff Atkinson, Founder-CEO of Huckabuy.

We are often asked why the SEO community and the marketing community at large have been slow to adopt dynamic rendering. Part of the reason is that these departments do not generally have a skillset covering subsets of technical SEO, like JavaScript SEO. Furthermore, they don’t always have the assistance of development team members who could help diagnose the issues and then implement the solution. As a result, it is an initiative that tends to fall by the wayside as more resources are devoted to less technical tactics like content creation and link building.

Think about cloaking as a classic “bait and switch.” A website might serve the Search Bot a page about cats, while the user sees content that is fundamentally different, for example, content about dogs instead. Google takes issue with these cases and penalizes them accordingly. But dynamic rendering is not cloaking. It is about giving Google the same essential content of a page in a format that they can crawl and index quickly, easily, and cheaply. They acknowledge and support this methodology in their documentation.

Should your business take care of dynamic rendering in-house?

It is possible, but it will cost more and quality will likely suffer. This is the type of service that is better to outsource than to train for in-house and risk doing at a lower quality. First, you need at least one capable developer who can alter your tech stack and wire together some form of rendering service, so time and maintenance will carry an ongoing cost. Second, if you do it wrong, or Google changes things and your development team is slow to adjust, your website suffers the consequences. On your own, you are completely in charge of how the most important visitor, the Google Search Bot, engages with your website. If you decide to dedicate one or two engineers from your development team to this process, it is imperative that they are experts.

In 2018, Google announced its support for dynamic rendering as a workaround that lets search bots access, crawl, and index JavaScript content converted to static HTML.

As JavaScript has taken off across the internet, Google’s job of crawling and indexing has become very difficult and requires a lot of money, time, and resources. Because almost every page on the internet now has JavaScript on it, Google’s rendering costs became far too high. So they started asking webmasters to serve up a separate version of their sites specifically for bots, one without JavaScript code.
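Serving bots a version without the JavaScript can, in the simplest case, mean rendering the page once and then stripping the script tags from the resulting HTML. The regex below is a rough sketch for illustration; a real pipeline would use an HTML parser rather than regular expressions.

```python
import re

# Matches <script>...</script> blocks, including attributes and newlines.
SCRIPT_RE = re.compile(r"<script\b[^>]*>.*?</script>", re.IGNORECASE | re.DOTALL)

def strip_scripts(rendered_html: str) -> str:
    """Remove script tags from an already-rendered HTML snapshot."""
    return SCRIPT_RE.sub("", rendered_html)
```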

It’s a pretty simple concept. Pages load dynamically based on what calls them. For example, if you go to a URL on your mobile phone, you’ll get one experience and if you go to the same URL on your desktop, you’ll get a slightly different experience. A site will be dynamically rendered to best fit the user experience for whatever device they’re using — mobile, tablet, desktop, and anything in-between.
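Device-based serving works the same way as bot detection: branch on the User-Agent. The heuristic below is illustrative only; real sites typically rely on a maintained device-detection library rather than hand-written substring checks.

```python
def classify_device(user_agent: str) -> str:
    """Rough User-Agent heuristic: 'mobile', 'tablet', or 'desktop'.
    Illustrative only; it will misclassify some devices (e.g. Android tablets)."""
    ua = (user_agent or "").lower()
    if "ipad" in ua or "tablet" in ua:
        return "tablet"
    if "mobi" in ua or "iphone" in ua:
        return "mobile"
    return "desktop"
```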

Does dynamic rendering affect the user experience?

No! It has no effect on users, who continue to see the normal client-side version of the site.

I am new to SEO and just want to get an idea of how it works for a single-page application with dynamic content.

In my case, I have a single-page application (powered by AngularJS, using the router to show different states) that provides some location-based search functionality, similar to Zillow, Redfin, or Yelp. On my site, a user can type in a location name, and the site will return some results based on that location.

I am trying to figure out a way to make it work well with Google. For example, if I type in “Apartment San Francisco” in Google, the results are listing pages from sites like those. And when a user clicks on those links, the sites display the correct results. I am thinking about having similar SEO for my site.

The question is, the page content depends entirely on the user’s query. A user can search by city name, state name, zip code, etc. to see different results, and it’s not possible to put them all into a sitemap. How can Google crawl the content for these kinds of dynamic page results?

I saw some tutorials saying we need to prepare static HTML specifically for search engines. If I only want to deal with Google, does that mean I no longer have to serve static HTML, because Google can run JavaScript?

Follow-up question: I saw that Googlebot can now run JavaScript, and I want to understand this a bit more. When a specific URL of my SPA is opened, it makes some network queries (XHR requests) for a few seconds before the page content is displayed. In this case, will Googlebot wait for the HTTP responses?

I don’t have experience with SEO and am not sure how to do it for my site. Please share some experience or pointers to help me get started. Thanks a lot!