Node.js SEO Tools

Node.js SEO Tools is a collection of useful utilities for web developers and SEO professionals. This library was created to help you save time and work smarter, not harder.

The library is built around the idea that SEO functionality should be built into the base of your application instead of added as an afterthought.


When choosing the tech stack for your application, you have to make several considerations. The programming language and framework that you choose affect the development time, the application’s performance, and its discoverability online.

One of the most important ways of getting discovered online is via organic searches through search engines. Search engines decide which results to display based on a few key factors. These are generally within the developer’s control, and you can “optimize” them to improve the searchability of your application. This is known as Search Engine Optimization (SEO), and it is one of the most important aspects of building and marketing your product.

In the next section, we will discuss some of the most essential SEO tags that every webpage should have. Then we’ll move forward into framework-based SEO considerations.


Essential SEO Tags (that your page can’t live without)


Title

The title is one of the most important parts of a page’s SEO. This is the title that is used by search engines when displaying your page in the results list. It is also the title used when you share your page on social media. You can set your webpage’s title like this:

 <head>
 	<title>Page Title</title>
 </head>

Description

The description of the page is the description that appears below the title in search engine results. It is also the description used in shares. To set the description for your webpage, just add this:

 <head>
 	<meta name="description" content="This is the description of the page" />
 </head>

Open Graph Image 

This tag doesn’t matter much in search engine results, but it’s vital for social media. It allows you to choose which image to display when the page is shared on social media websites like Twitter, Facebook, and LinkedIn. Set an attractive image via this tag to ensure that your link attracts a lot of attention.

 <head>
 	<meta property="og:image" content="https://yourdomain.com/image.png"/>
 </head>

 A setup with all the necessary SEO tags would look like this:

 <head>
 	<title>Page Title</title>
 	<meta name="description" content="This is the description of the page" />
 	<meta property="og:image" content="https://yourdomain.com/image.png"/>
 </head>

In the next section, we will discuss how you can insert these tags into your webpages according to the tech stack of your application.

Types of frameworks and their implications

No framework—pure HTML

If you’re not using any JavaScript-based framework to build your application, all the SEO tags are in your control for each page via the HTML file. Therefore, no special library is required to set the tags.

Search engine crawlers also like this setup because it allows them to easily crawl your pages.

Using a framework (JS or server-rendered applications)

As we talk about optimizing SEO for framework-rendered apps, we’ll focus specifically on the React framework and on optimizing for the Google search engine. However, the following section will still be relevant to all frameworks and search engines.

In React, the most common way of rendering is client-side rendering. In essence, it’s a single HTML root file with a React script attached to it that renders the webpage at runtime. All routes are rendered from that single HTML file, on the client’s side, after a route is queried and the data is received. This is why it’s called client-side rendering. However, the developer doesn’t have control over the SEO tags in this case since only a single HTML file exists. So, in cases like these, certain libraries are used to set SEO tags dynamically.

But there is another issue that client-side–rendered apps face. Search engine crawlers can’t crawl these webpages properly because the pages are generated at runtime. To solve that, the developers of web crawlers have made certain additions. For example, Google’s web crawler queues a page for JS rendering when it detects that the page is JS-rendered.

However, the search engine indexing for such a page is delayed until the JS is rendered and the page is readable by a bot. The bot has to do this for every single page in your website. This is a long process, so errors occurring at any step will prevent that page from being indexed by the search engine.



To combat these problems, Google suggests the following solutions:


Pre-rendering 

Pre-rendering is the technique of converting client-side–rendered applications into static HTML files through the process of rendering. A pre-rendering tool renders the application by visiting each route individually and generating an HTML file for each one of them. However, this process becomes quite slow for larger applications, and dynamic linking is not possible in a pre-rendered React app since each page has to be present at compile time. So, it is limited to static pages or fetching dynamic content using query parameters.

Isomorphic rendering (client + server side)

Isomorphic rendering is also known as hybrid rendering. When a user-agent, such as a Google bot, queries a URL of such an application, a server-rendered page is sent. Otherwise, a client-rendered page is sent to the rest of the users. This ensures that search engines index the page correctly and that client-side rendering still works for other clients. However, this type of rendering is very complicated to use and doesn’t offer any advantages over completely server-rendered React. Therefore, it isn’t widely used and doesn’t have any good packages or libraries for most frameworks.

Server-side rendering

In server-side rendering, the page is completely rendered on the server side before being sent to the client. So, the client gets a complete HTML page as a response. This is good for SEO too, as search engine crawlers get a completely rendered webpage, which makes their job easier. In turn, it also increases the speed at which your pages are indexed by the crawlers.


In the next section, we’ll discuss the best libraries in each framework that can be used to implement best SEO practices in your frontend.

Optimizations for specific frameworks

1. React-based frameworks


Client-side rendering

When React is rendered on the client side, you can use react-helmet, which lets you set the title and meta tags as each page renders.
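
For example, here’s a minimal sketch of a reusable component built with react-helmet (the component name and props are hypothetical; it assumes react-helmet is installed):

// Seo.js — a hypothetical reusable SEO component using react-helmet
import React from 'react';
import { Helmet } from 'react-helmet';

const Seo = ({ title, description, image }) => (
  <Helmet>
    <title>{title}</title>
    <meta name="description" content={description} />
    <meta property="og:image" content={image} />
  </Helmet>
);

export default Seo;

// Usage inside any page component:
// <Seo title="Page Title" description="This is the description of the page" image="https://yourdomain.com/image.png" />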

Pre-rendering

You should use react-snap or react-snapshot when using create-react-app. GatsbyJS is also a good framework for rendering React applications to static HTML files.
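
As a rough sketch of how react-snap is typically wired into a create-react-app project (assuming react-snap is installed and run as a post-build step), the entry point hydrates the pre-rendered markup instead of rendering from scratch:

// src/index.js — hydrate markup pre-rendered by react-snap (sketch)
import React from 'react';
import { hydrate, render } from 'react-dom';
import App from './App';

const rootElement = document.getElementById('root');

if (rootElement.hasChildNodes()) {
  // The HTML was pre-rendered by react-snap, so hydrate it
  hydrate(<App />, rootElement);
} else {
  // Fall back to a normal client-side render (e.g., during development)
  render(<App />, rootElement);
}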



Server-side rendering

You should use a framework like NextJS to perform server-side rendering with React. This will allow the search engines to easily index your webpages.
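
For instance, here’s a minimal sketch of a server-rendered Next.js page (file name and data are illustrative) that fetches its data in getServerSideProps and sets its tags with next/head:

// pages/index.js — a server-rendered Next.js page (sketch)
import Head from 'next/head';

export async function getServerSideProps() {
  // Fetch the page's data on the server; hard-coded here for illustration
  return {
    props: {
      title: 'Page Title',
      description: 'This is the description of the page',
    },
  };
}

export default function Home({ title, description }) {
  return (
    <>
      <Head>
        <title>{title}</title>
        <meta name="description" content={description} />
      </Head>
      <main>Server-rendered content goes here</main>
    </>
  );
}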

2. NodeJS/ExpressJS 


Pre-rendering

 The library prerender-node works with any Node-rendered framework to render all routes as static webpages.
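
Here’s a minimal sketch of wiring it into an Express app (assuming prerender-node is installed and you have a Prerender service token; the token below is a placeholder):

// server.js — serving pre-rendered pages to crawlers with prerender-node (sketch)
const express = require('express');
const prerender = require('prerender-node');

const app = express();

// Requests from known bots are proxied to the Prerender service
app.use(prerender.set('prerenderToken', 'YOUR_PRERENDER_TOKEN'));

app.get('/', (req, res) => {
  res.send('<html><head><title>Page Title</title></head><body>Hello</body></html>');
});

app.listen(3000);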

Server-side rendering

NodeJS is a server-side JavaScript runtime, and Express is a routing framework for it. So, you get server-side rendering out of the box with Node. The only thing you’ll need to take care of is setting the SEO tags dynamically via EJS templates.
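
Here’s a minimal sketch of that pattern (assuming express and ejs are installed, with a hypothetical views/index.ejs template that prints <%= title %> and <%= description %> into its head):

// server.js — setting SEO tags per route with Express and EJS (sketch)
const express = require('express');
const app = express();

app.set('view engine', 'ejs');

app.get('/', (req, res) => {
  // These values are rendered into the template's <title> and <meta> tags
  res.render('index', {
    title: 'Page Title',
    description: 'This is the description of the page',
  });
});

app.listen(3000);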

3. AngularJS 

Client-side rendering

You can use a library like ngx-seo-page when working with client-side Angular. It allows you to dynamically set the SEO tags during page render.

Pre-rendering

Modules like angular-prerender can be used to pre-render Angular apps. It visits both server and client routes and combines them to form a static client.

Server-side rendering

Angular Universal provides native support for server-side rendering Angular apps. You can combine it with ngx-seo-page to set SEO tags on your server-rendered application.

For more information, check out this post about Angular SEO.


4. VueJS 


Client-side rendering

For client-side rendering with Vue, there are not many frameworks that allow dynamically setting SEO tags. One of them is vue-seo. However, its last update was two years ago, so pre-rendering or server-side rendering is preferred for better SEO.

Pre-rendering

To pre-render a Vue single-page application, a plugin like vue-cli-plugin-prerender-spa can be used. It is a robust solution that works with zero configuration.

Server-side rendering

Using frameworks like Nuxt.JS, you can easily create server-rendered Vue applications. It allows you to render your application on the server, run a client-side app, or generate pre-rendered static files easily.
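
As a sketch, per-page SEO tags in Nuxt (using the Nuxt 2-style head() option; file names and values are illustrative) look like this:

// pages/index.vue (script section) — per-page tags via Nuxt's head() option (sketch)
export default {
  head() {
    return {
      title: 'Page Title',
      meta: [
        { hid: 'description', name: 'description', content: 'This is the description of the page' },
        { hid: 'og:image', property: 'og:image', content: 'https://yourdomain.com/image.png' },
      ],
    };
  },
};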

5. Django/Python 


Server-side rendering

The default way of using Django is via server-side rendering. HTML templates are rendered server-side according to the data passed to them via the server. So, you get the benefits of setting dynamic SEO tags by default.

Did you know ButterCMS works seamlessly with all of these frameworks? Our newly launched WRITE API makes integration smooth for developers and our content dashboard makes churning out content easy for marketers.


Summarizing the Content (TLDR)

In summary, we see that client-side–rendered applications face issues in being indexed correctly by search engines, and developers also face issues while setting SEO tags in these applications. However, these challenges can be overcome by relying on a variety of solutions based on the frameworks and rendering techniques used.


What you can do further to improve your website’s SEO

To improve your application’s SEO, follow all the SEO guidelines presented in this article to establish basic SEO correctness. When using header tags within your website (<h1>, <h2>, etc.), ensure you use all the relevant keywords, and repeat those keywords throughout the accompanying content. If you do these things, you’ll be ranking high in the search results in no time!

Automate Google Search With JavaScript

Programming and automation are increasingly popular topics in the SEO industry, and rightfully so.

Leveraging new ways to extract, transform, and analyze data at scale with minimal human input can be incredibly useful.

Although speed is important, one of the main benefits of using automation is that it takes the weight off our shoulders from repetitive tasks and leaves us more time to use our brains.

Read on to learn some of the benefits of using JavaScript to automate SEO tasks, the main avenues you can take to start using it, and a few ideas to hopefully spark your curiosity.

Why Learn Automation With JavaScript?

A lot of fantastic automation projects in the community come from SEO professionals coding in Python, including Hamlet Batista, Ruth Everett, Charly Wargnier, Justin Briggs, Britney Muller, Koray Tuğberk GÜBÜR, and many more.

However, Python is only one of the many tools you can use for automation. There are multiple programming languages that can be useful for SEO such as R, SQL, and JavaScript.

Outside of the automation capabilities you’ll learn in the next section, there are clear benefits from learning JavaScript for SEO. Here are just a few:

1. To Advance Your Knowledge to Audit JavaScript on Websites

Whether or not you deal with web apps built with popular frameworks (e.g., Angular, Vue), the chances are that your website is using a JavaScript library like React, jQuery, or Bootstrap.

(And perhaps even some custom JavaScript code for a specific purpose.)

Learning to automate tasks with JavaScript will help you build a more solid foundation to dissect how JavaScript or its implementation may be affecting your site’s organic performance.

2. To Understand and Use New Exciting Technologies Based on JavaScript

The web development industry moves at an incredibly fast pace. Hence, new transformative technologies emerge constantly, and JavaScript is at the center of it.

By learning JavaScript, you’ll be able to better understand technologies like service workers, which may directly affect SEO and be leveraged to its benefit.

Additionally, JavaScript engines like Google’s V8 are getting better every year. JavaScript’s future only looks brighter.

3. To Use Tools Like Google Tag Manager That Rely on JavaScript to Work

If you work in SEO, you may be familiar with Tag Management Systems like Google Tag Manager or Tealium. These services use JavaScript to insert code (or tags) into websites.

By learning JavaScript, you will be better equipped to understand what these tags are doing and potentially create, manage and debug them on your website.

4. To Build or Enhance Your Own Websites with JavaScript

One of the great things about learning to code in JavaScript is that it will help you to build websites as side projects or testing grounds for SEO experiments.

There is no better way to understand something than by getting your hands dirty, especially if what you want to test relies on JavaScript.

Paths to Leveraging JavaScript for SEO Automation

JavaScript was initially developed as a browser-only language but has now evolved to be everywhere, even on hardware like microcontrollers and wearables.


For the purposes of SEO automation, there are two main environments where you can automate SEO tasks with JavaScript:

  • A browser (front-end).
  • Directly on a computer/laptop (back-end).

SEO Automation with Your Browser

One of the main advantages that separate JavaScript from other scripting languages is that browsers can execute JavaScript. This means the only thing you need to get started with JavaScript automation is a browser.

Automation Using the Browser’s Console

The easiest way to get started is using JavaScript directly in your browser’s console.

There are some easy and fun automations you can do. For example, you can make any website editable by typing “document.body.contentEditable = true” in your console.

Example of using JavaScript directly in your browser’s console.

This could be useful for mocking up new content or headings on the page to show to your clients or other stakeholders in your company without the need for image editing software.
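
As another quick, hypothetical console example, this snippet logs the page’s heading structure, which is handy for spotting missing or duplicated headings:

// Paste into the browser console: log every h1–h3 with its text
document.querySelectorAll('h1, h2, h3').forEach((heading) => {
  console.log(heading.tagName, '-', heading.textContent.trim());
});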

But let’s step it up a bit more.

The Lesser-Known Bookmarklets

Since a browser’s console can run JavaScript, you can create custom functions that perform specific actions like extracting information from a page.

However, creating functions on the spot can be a bit tedious and time-consuming. Therefore, Bookmarklets are a simpler way to save your own custom functions without the need for browser plugins.

Bookmarklets are small code snippets saved as browser bookmarks that run functions directly from the browser tab you are on.

An example of a Bookmarklet.

For example, Dominic Woodman created a bookmarklet here that allows users to extract crawl stats data from the old Google Search Console user interface and download it to a CSV.
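
As a much simpler, hypothetical illustration of the format, here’s a bookmarklet that logs every link URL on the current page; the javascript: prefix is what lets it run from a bookmark:

// Readable version: collect and log every link URL on the current page
(function () {
  var links = Array.from(document.querySelectorAll('a[href]')).map(function (a) { return a.href; });
  console.log(links.join('\n'));
})();

// The same code condensed into a single line so it can be saved as a bookmark URL:
// javascript:(function(){var links=Array.from(document.querySelectorAll('a[href]')).map(function(a){return a.href});console.log(links.join('\n'))})();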

It might sound a bit daunting, but you can learn how to create your own Bookmarklets following the steps in this great resource on GitHub.

Snippets, a User-Friendly Version of Bookmarklets

If you use Chrome, there is an even simpler solution using Snippets. With this, you can create and save the same type of functions in a much more user-friendly way.

For example, I’ve created a small Snippet that checks how many “crawlable” links are on-page and downloads the list to a CSV file. You can download the code from GitHub here.


While these are usually small “nice to have” tasks, you’d probably want to do more heavy-lifting tasks that can help with your SEO workload in a more significant way.

Therefore, it’s much better to use JavaScript directly on your laptop (or a server in the Cloud) using Node.js.

SEO Automation in the Back-End with Node.js

Node.js is software that lets you run JavaScript code on your laptop without the need for a browser.

There are some differences between running JavaScript in your browser and JavaScript on your laptop (or a server in the Cloud) but we’ll skip these for now as this is just a short intro to the topic.


In order to run scripts with Node.js, you must have it installed on your laptop. I have written a short blog post where I go step by step on how you can install Node as well as a few additional setup tips to make it easier to get you started.

Although your imagination is the limit when it comes to automation, I’ve narrowed it down to a few areas that I see SEO professionals come back to when using Node.js.

I will include scripts that are ready to run so you can get started in no time.

Extracting Data From APIs

Collecting information from different sources to provide insights and recommend actions is one of the most common jobs in SEO.

Node.js makes this incredibly simple with different options, but my preferred go-to module is Axios.

// Create an index.js file inside a folder & paste the code below

// Import axios module
const axios = require('axios');

// Custom function to extract data from PageSpeed API
const getApiData = async (url) => {
  const endpoint = 'https://www.googleapis.com/pagespeedonline/v5/runPagespeed';
  const key = 'YOUR-GOOGLE-API-KEY'; // Edit with your own key
  const apiResponse = await axios(`${endpoint}?url=${url}&key=${key}`); // Create HTTP call
  console.log(apiResponse.data); // Log data
  return apiResponse.data;
};

// Call your custom function
getApiData('https://www.searchenginejournal.com/');

To start interacting with APIs, you need a module that’s able to handle HTTP requests (HTTP client) and an endpoint (a URL to extract information).

In some cases, you might also need an API key, but this is not always necessary.

For a taste of how Node.js interacts with APIs, check out this script I published that uses Google’s PageSpeed API to extract Core Web Vitals data and other lab metrics in bulk.

Scraping Websites

Whether you would like to monitor your own website, keep an eye on your competitors or simply extract information from platforms that don’t offer an API, scraping is an incredibly useful tool for SEO.

Since JavaScript interacts well with the DOM, there are many advantages to using Node.js for scraping.

The most common module I’ve used for scraping is Cheerio, which has a very similar syntax to jQuery, in combination with an HTTP client like Axios.

// Import modules
const cheerio = require('cheerio');
const axios = require('axios');

// Custom function to extract title
const getTitle = async (url) => {
  const response = await axios(url); // Make request to desired URL
  const $ = cheerio.load(response.data); // Load it with cheerio.js
  const title = $('title').text(); // Extract title
  console.log(title); // Log title
  return title;
};

// Call custom function
getTitle('https://www.searchenginejournal.com/');

If you need the rendered version of a website, popular modules like Puppeteer or Playwright can launch a headless instance of an actual browser like Chrome or Firefox and perform actions or extract information from the DOM.

Chris Johnson’s Layout Shift Generator is a great example of how to use Puppeteer for SEO. You can find more info here or download the script here.

There are also other options like JSDOM that emulate what a browser does without the need for a browser. You can play around with a JSDOM-based script using this Node.js SEO Scraper built by Nacho Mascort.

Processing CSV and JSON Files

Most of the time, the data extracted from APIs comes as JSON objects, and JavaScript is perfect for dealing with those.

However, as SEOs, we normally deal with data in spreadsheets.

Node.js can easily handle both formats using built-in modules like the File System module or more simplified versions like csvtojson or json2csv.

Whether you want to read data from a CSV and transform it into JSON for processing, or you’ve already manipulated the data and you need to output to CSV, Node.js has your back.

// Import Modules
const csv = require('csvtojson');
const { parse } = require('json2csv');
const { writeFileSync } = require('fs');

// Custom function to read a CSV file, convert it to JSON, then write it back out as CSV
const readCsvExportJson = async () => {
  const json = await csv().fromFile('yourfile.csv'); // Edit with your own file name
  console.log(json); // Log the data converted to JSON

  const converted = parse(json);
  console.log(converted); // Log the data converted back to CSV
  writeFileSync('new-csv.csv', converted); // Write the result to a new CSV file
};

readCsvExportJson();

Create Cloud Functions to Run Serverless Tasks

This is a more advanced case, but it’s incredibly helpful for technical SEO.

Cloud computing providers like Amazon AWS, Google Cloud Platform, or Azure make it incredibly simple (and cheap) to set up “instances of servers” that run custom scripts created for specific purposes without the need for you to configure that server.

A useful example would be to schedule a function that extracts data from the Google Search Console API automatically at the end of every day and stores the data into a BigQuery database.

Cloud Function example: extract Search Console data and load it into BigQuery.
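
As a rough, hypothetical sketch of such a function (it assumes the googleapis and @google-cloud/bigquery packages, a service account with access to the Search Console property, and an existing BigQuery dataset and table; in practice you’d trigger it with Cloud Scheduler and compute yesterday’s date instead of hard-coding one):

// index.js — Cloud Function: pull Search Console data and load it into BigQuery (sketch)
const { google } = require('googleapis');
const { BigQuery } = require('@google-cloud/bigquery');

exports.exportSearchAnalytics = async (req, res) => {
  // Authenticate as the function's service account
  const auth = await google.auth.getClient({
    scopes: ['https://www.googleapis.com/auth/webmasters.readonly'],
  });
  const searchconsole = google.searchconsole({ version: 'v1', auth });

  // Pull one day of query/page data from the Search Console API
  const { data } = await searchconsole.searchanalytics.query({
    siteUrl: 'https://www.example.com/', // hypothetical property
    requestBody: {
      startDate: '2021-06-01', // hard-coded for illustration
      endDate: '2021-06-01',
      dimensions: ['query', 'page'],
      rowLimit: 5000,
    },
  });

  // Map the API rows to the BigQuery table's schema and insert them
  const rows = (data.rows || []).map((row) => ({
    query: row.keys[0],
    page: row.keys[1],
    clicks: row.clicks,
    impressions: row.impressions,
    ctr: row.ctr,
    position: row.position,
  }));

  const bigquery = new BigQuery();
  await bigquery.dataset('seo').table('gsc_daily').insert(rows); // hypothetical dataset/table

  res.send(`Inserted ${rows.length} rows`);
};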

There are a few moving parts in this case, but the possibilities are truly endless.

Just to show you an example of how to create a cloud function, check out this episode of Agency Automators where Dave Sottimano creates his own Google Trends API using Google Cloud Functions.

A Potential Third Avenue: Apps Script

For me, it made more sense to start with the less-opinionated approach.

But Apps Script may offer a less intimidating way to learn to code because you can use it in apps like Google Sheets, which are the bread and butter of technical SEO.
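
To give a flavor, here’s a small hypothetical Apps Script custom function; paste it into a Google Sheet’s script editor and you can call it from a cell like =GET_TITLE(A1):

// Apps Script custom function: fetch a URL and return the text of its <title> tag
function GET_TITLE(url) {
  var html = UrlFetchApp.fetch(url).getContentText();
  var match = html.match(/<title[^>]*>([^<]*)<\/title>/i);
  return match ? match[1].trim() : 'No title found';
}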


There are really useful projects that can give you a sense of what you can do with Apps Script.

For example, Hannah Butler’s Search Console explorer or Noah Lerner’s Google My Business Postmatic for local SEO.

If you are interested in learning Apps Script for SEO, I would recommend Dave Sottimano’s Introduction to Google Apps Scripts. He also gave this awesome presentation at Tech SEO Boost, which explains many ways to use Apps Script for SEO.

Final Thoughts

JavaScript is one of the most popular programming languages in the world, and it’s here to stay.

The open-source community is incredibly active and constantly bringing new developments in different verticals – from web development to machine learning – making it a perfect language to learn as an SEO professional.

What you’ve read about in this article is just the tip of the iceberg.

Automating tasks is a step toward leaving behind dull and repetitive everyday tasks, becoming more efficient, and finding new and better ways to bring value to our clients.

Hopefully, this article chips away a little at the somewhat bad reputation JavaScript has in the SEO community and sparks a bit of curiosity to start coding.

