
Python for SEO: Learn 5 easy and engaging projects

June 13, 2023 - 9 min reading time - by Majid Jorkesh

Are you an SEO who’s interested in learning Python and using it for your various projects? Whether you’re just starting out or have some programming experience, this article is for you.

In it, we’ll introduce five straightforward projects that are ideal for SEOs looking to hone their skill set. These five tasks provide a great introduction to learning Python while also helping you to gain practical experience. By the end of it, you’ll have an in-depth understanding of these projects as well as be equipped with all of the necessary knowledge to take your Python skills up a notch. Let’s get started!

1. Title and description checker

This Python script allows you to verify if a web page has meta titles and descriptions or not. All you need to do is enter the URL of the page you want to check and this script will work its magic.

Why is this project useful?

Meta tags (HTML tags that are invisible on the page itself) describe your content, tell search engines what your page is about, and serve as the source of the text displayed in the SERPs. They are thus essential to on-page SEO.

If you don't provide meta titles and descriptions that give users an accurate preview of your page and encourage click-through, search engines may display generic or irrelevant text of their own choosing instead.

By strategically crafting titles and descriptions that explicitly detail what your business does, you provide the information search engines need to accurately represent your business online and subsequently drive the right traffic to your site.

Imagine you own a website selling handmade jewelry. You might use a title and description like these:

  • Title: Handmade jewelry for every occasion
  • Description: Shop our collection of unique and stylish handmade jewelry to find your piece now
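For illustration, here is roughly what those two elements look like in a page's HTML. The snippet below is a small, hypothetical example parsed with BeautifulSoup in the same way the script further down does:

from bs4 import BeautifulSoup

# Hypothetical <head> markup for the jewelry example above
html = """
<head>
  <title>Handmade jewelry for every occasion</title>
  <meta name="description"
        content="Shop our collection of unique and stylish handmade jewelry to find your piece now">
</head>
"""

soup = BeautifulSoup(html, 'html.parser')
print(soup.title.get_text())
print(soup.find('meta', attrs={'name': 'description'})['content'])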

With the script below, users can quickly determine whether a given page already includes these elements and make the necessary adjustments to boost SEO performance on that particular page.

 

import requests
from bs4 import BeautifulSoup

url = input("Enter the URL of the web page: ")

# Make a request to the URL
response = requests.get(url)

# Parse the HTML content of the page using BeautifulSoup
soup = BeautifulSoup(response.content, 'html.parser')

# Find the <title> tag (the source of the title shown in the SERPs) and print its content
title_tag = soup.find('title')
if title_tag and title_tag.get_text(strip=True):
    print('Meta title found: ' + title_tag.get_text(strip=True))
else:
    print('No meta title found')

# Find the meta description tag and print its content
meta_description = soup.find('meta', attrs={'name': 'description'})
if meta_description and meta_description.get('content'):
    print('Meta description found: ' + meta_description['content'])
else:
    print('No meta description found')

How does it work?

  1. The script prompts the user to enter the URL of a web page they wish to analyze.
  2. The script sends an HTTP request using the requests library and stores its response in a variable.
  3. BeautifulSoup then parses and interprets the HTML content.
  4. The script searches the HTML for the page's <title> tag and its “description” meta tag.
  5. If these tags are found, their content is printed.
  6. Conversely, if none are present, a message indicating their absence will be displayed instead.
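As an optional extension (not part of the script above), you could also flag titles and descriptions that are likely to be truncated in the SERPs. The thresholds below are rough, commonly cited guidelines rather than official limits:

# Rough, commonly cited length guidelines; adjust to your own standards
MAX_TITLE_CHARS = 60
MAX_DESCRIPTION_CHARS = 160

def check_length(label, text, limit):
    if text and len(text) > limit:
        print(f"{label} is {len(text)} characters long (over the ~{limit}-character guideline)")

# Example usage with the jewelry title and description from earlier
check_length("Title", "Handmade jewelry for every occasion", MAX_TITLE_CHARS)
check_length("Description", "Shop our collection of unique and stylish handmade jewelry to find your piece now", MAX_DESCRIPTION_CHARS)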

2. Schema scraper

This project lets users enter the URL of a webpage and automatically retrieves its schema information using the requests and beautifulsoup4 libraries in Python.

Why is this project useful?

Schema markup, or structured data, gives search engines additional details about a page's content. This improves the page's visibility in the SERPs and enables richer experiences, since search engines use this information to generate rich snippets.

Let’s say, for example, you own a webpage for a local restaurant. By employing schema markup, you can provide search engines with additional information such as its address, phone number, menu items and hours of operation. This data may then appear prominently in the SERPs as rich snippets.
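To make that concrete, here is a simplified, hypothetical example of the kind of JSON-LD block a restaurant page might embed (the business details are invented for illustration):

import json

# Hypothetical schema.org Restaurant markup, as it might appear inside
# a <script type="application/ld+json"> tag
restaurant_schema = {
    "@context": "https://schema.org",
    "@type": "Restaurant",
    "name": "Example Bistro",
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "streetAddress": "123 Main Street",
        "addressLocality": "Springfield"
    },
    "openingHours": "Mo-Sa 11:00-22:00",
    "servesCuisine": "French"
}

print(json.dumps(restaurant_schema, indent=2))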

Using this script, SEOs can quickly retrieve and examine any page’s schema data instead of manually searching and analyzing each individual web page. It saves users both time and energy.

 

import requests
from bs4 import BeautifulSoup

# Get the URL input from the user
url = input("Enter the URL of the page to retrieve schema information: ")

# Send a request to the URL and retrieve the HTML content
response = requests.get(url)
html_content = response.text

# Use beautifulsoup4 to parse the HTML content and find the schema information
soup = BeautifulSoup(html_content, 'html.parser')
schema_tags = soup.find_all('script', attrs={'type': 'application/ld+json'})

# Print the schema information for each tag found
for schema_tag in schema_tags:
    if schema_tag.string:  # skip empty script tags
        schema_data = schema_tag.string.strip()
        print(schema_data)

How does it work?

  1. The user is asked to enter the URL of a page for which they wish to retrieve schema information using the input() function.
  2. The requests library then sends a GET request to that specified URL in order to obtain its HTML content.
  3. BeautifulSoup then parses the HTML content and generates a BeautifulSoup object.
  4. BeautifulSoup then searches for all <script> tags whose type attribute is “application/ld+json”.
  5. If any such tags are found, their content (which should be valid JSON-LD data) is extracted using the string attribute.
  6. The extracted JSON-LD data is then printed to the console using the print() function.
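If you want to go a step beyond printing the raw markup, a small addition (building on the schema_tags found by the script above, and assuming the tags contain valid JSON) can parse each block and list the schema types it declares:

import json

for schema_tag in schema_tags:
    raw = schema_tag.string
    if not raw:
        continue
    try:
        data = json.loads(raw)
    except json.JSONDecodeError:
        print("Skipping a script tag that does not contain valid JSON")
        continue
    # A JSON-LD block can hold a single object or a list of objects
    items = data if isinstance(data, list) else [data]
    for item in items:
        print("Schema type found:", item.get("@type", "unknown"))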

3. Page speed checker

This program offers a simple Python interface to quickly assess the speed of any website page.

Why is this project useful?

Page speed is an integral component of user experience and search engine optimization. Using this script, SEOs and developers can easily measure the load time of a webpage and identify performance issues that may affect user experience or search engine rankings. As you probably know, Google made page speed a ranking factor back in 2010, so it carries real weight.

If you are in the e-commerce industry and your business depends on a strong conversion rate from your site, making sure your pages load as quickly as possible is an absolute must.

 

import requests
import time

def check_page_speed():
    # Prompt user to enter the URL to check
    url = input("Enter the URL to check page speed: ")
    # Record the start time and send a GET request to the given URL
    start_time = time.time()
    response = requests.get(url)
    # Record the end time and calculate the page load time
    end_time = time.time()
    page_load_time = end_time - start_time
    # Print the page load time in seconds
    print(f"Page load time: {page_load_time:.2f} seconds")

# Example usage:
check_page_speed()

How does it work?

  1. The program prompts the user to enter the URL of the page they want to check.
  2. It records the start time using the time library, then sends a GET request to the URL using the requests library.
  3. The server responds with the HTML content of the requested page.
  4. The program records the end time and calculates the page load time by subtracting the start time from the end time.
  5. It prints the page load time, in seconds, to the console.
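Keep in mind that this measures only how long the server takes to return the HTML, not how long the browser takes to render the full page, so the result will differ from tools like Google PageSpeed Insights. To smooth out network noise, one simple refinement (a sketch, not part of the original script) is to time several requests and average them:

import time
import requests

def average_load_time(url, runs=3):
    # Time several GET requests and return the mean response time in seconds
    timings = []
    for _ in range(runs):
        start = time.time()
        requests.get(url)
        timings.append(time.time() - start)
    return sum(timings) / len(timings)

# Example usage with a placeholder URL
print(f"Average load time: {average_load_time('https://example.com'):.2f} seconds")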

4. HTTPS checker

This Python code snippet takes a URL as an input and checks whether it has an SSL certificate or not.


Why is this project useful?

A valid certificate helps ensure that data transmitted between user browsers and websites remains encrypted and secure, and this code snippet allows SEOs to quickly check the SSL status of the websites they are working on.

Site users have become accustomed to trusting SSL (Secure Sockets Layer) certification as an indication that their personal data and online activities are safe from hackers and other attacks. SSL establishes an encrypted link between a web server and user browsers, guaranteeing all transmitted data remains private and confidential.

 

import requests

def check_ssl(url):
    # Default to https:// if the user did not include a scheme
    if not url.startswith(('http://', 'https://')):
        url = 'https://' + url
    try:
        response = requests.get(url)
    except requests.exceptions.SSLError:
        # The handshake failed, so the certificate is missing, expired or untrusted
        return "The website does not have a valid SSL certificate"
    except requests.exceptions.RequestException:
        return "Error: Could not access the website"
    if response.status_code != 200:
        return "Error: Could not access the website"
    if response.url.startswith('https://'):
        return "The website has a valid SSL certificate"
    else:
        return "The website does not have a valid SSL certificate"

# Example usage
url = input("Enter the website URL: ")
result = check_ssl(url)
print(result)

How does it work?

  1. If the URL is entered without a scheme, check_ssl prepends https:// before making the request.
  2. The requests library sends a GET request to the URL and receives its response.
  3. If the SSL handshake fails (for example, because the certificate is missing, expired or untrusted), requests raises an SSLError, which the function catches in order to report that the site does not have a valid certificate; other request failures and non-200 status codes return a generic "could not access the website" error.
  4. If the URL of the final response begins with https://, the website has a valid SSL certificate and a success message is returned.
  5. Otherwise, the function reports that the website does not have a valid SSL certificate.
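If you also want to know when a certificate expires, Python's standard ssl and socket modules can retrieve that directly. The sketch below is a separate helper, not part of the script above, and example.com is just a placeholder hostname:

import socket
import ssl
from datetime import datetime

def get_certificate_expiry(hostname, port=443):
    # Open a TLS connection and read the certificate the server presents
    context = ssl.create_default_context()
    with socket.create_connection((hostname, port), timeout=10) as sock:
        with context.wrap_socket(sock, server_hostname=hostname) as tls:
            cert = tls.getpeercert()
    # 'notAfter' looks like 'Jun 13 12:00:00 2024 GMT'
    return datetime.strptime(cert['notAfter'], '%b %d %H:%M:%S %Y %Z')

# Example usage
print(get_certificate_expiry('example.com'))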

5. Image alt checker

Below is a Python code example using the BeautifulSoup library to parse a URL’s HTML content and determine whether the images on that page have an “alt” attribute or not.

Why is this project useful?

Google, Bing and other search engines use the alt attribute of images to understand their context within pages and their relevance to overall content. Now that multimodal search is becoming more prevalent, it’s key to make sure your images can be found and indexed as well.

By including descriptive and relevant alt text for images, you can enhance accessibility for visually impaired visitors while helping search engines better comprehend them.

import requests
from bs4 import BeautifulSoup

# Function to check if an image has an "alt" attribute
def has_alt_attribute(img):
    return "alt" in img.attrs

# Get the URL from the user
url = input("Enter the URL of the page to check: ")

# Make a request to the URL and get the HTML content
response = requests.get(url)
html_content = response.content

# Parse the HTML content using BeautifulSoup
soup = BeautifulSoup(html_content, 'html.parser')

# Find all the image tags on the page
images = soup.find_all('img')

# Check if each image has an "alt" attribute or not
for image in images:
    src = image.get('src', '(no src attribute)')
    if has_alt_attribute(image):
        print(f"The image with source '{src}' has an 'alt' attribute.")
    else:
        print(f"The image with source '{src}' does not have an 'alt' attribute.")

How does it work?

  1. This code defines a function called has_alt_attribute that takes an image tag as input and returns True if the tag contains an “alt” attribute, otherwise False.
  2. We then ask the user to enter the URL of the page they wish to check and make a request to that URL for its HTML content.
  3. We use BeautifulSoup to parse this HTML content and locate all image tags present on that page.
  4. We then loop through each image tag and use the has_alt_attribute function to check if it has an “alt” attribute or not.
  5. Finally, we print a message for each tag indicating whether or not it has an “alt” attribute.
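On larger pages the line-by-line output can get long, so a small summary (building on the images list and the has_alt_attribute function from the script above) is a handy addition:

# Count how many of the images found above are missing alt text
missing_alt = [img for img in images if not has_alt_attribute(img)]
print(f"{len(missing_alt)} of {len(images)} images are missing an 'alt' attribute.")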

Conclusion

Learning Python can be a valuable skill for SEO specialists looking to enhance their ability to analyze and optimize websites. Python has a number of SEO uses including automating repetitive tasks, scraping web data, and developing tools to assist with daily work tasks.

Working through projects like these and understanding their underlying code helps SEOs build knowledge and save time, ultimately increasing their productivity. We hope you'll find these scripts helpful. Let us know what you think in the comments section below.

Majid Jorkesh
Majid Jorkesh is an accomplished SEO specialist and a valued member of the team at Digid.ca. With six years of experience in the field of SEO, Majid has helped a number of online businesses improve their search engine rankings and increase their online visibility.