ChatGPT prompts for analysing Search Console query data

US$7.99

A series of ChatGPT prompts designed to enable effortless analysis of your Google Search Console query data.

No technical or analytical experience necessary

Key benefits:

  • Understand Search Intent of your audience

  • Pinpoint the reasons behind traffic changes

  • Group queries strategically (e.g. brand, product, navigational)

  • Create insightful visualisations

  • Identify emerging search trends and opportunities

  • Summarise insights in PDF format

  • Sample data included

  • Turn your analysis into a reproducible set of insights and reports

REQUIREMENTS:

  • ChatGPT 4 (or any other AI tool that supports data uploads)

  • CSV or Excel file with your Google Search Console data, with the following columns:

    • Query

    • Date

    • Impressions

    • Clicks

    • Position

Download Prompt Pack & Sample Data

Before you can analyse your Google Search Console data, you first need to prepare it.

You need a CSV or Excel file like the "Sample data.csv" file included in our prompt pack. The file must contain the following columns:

  • Query

  • Date

  • Impressions

  • Clicks

  • Position

Note that the order of the columns is immaterial as long as they are named as shown above.
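
For illustration, here are a few hypothetical rows in the expected layout (the queries and numbers below are made up):

Query,Date,Impressions,Clicks,Position
google search console tips,2024-03-01,1250,48,6.3
export search console data,2024-03-01,310,12,8.9
analyticshacker,2024-03-02,95,22,1.4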

Ways to get Google Search Console data

1. Google Sheets Add-on

One of the easiest and fastest ways of extracting data from Search Console is by using a Google Sheets add-on called "Search Analytics for Sheets."

The add-on is free to use, but you won't be able to extract more than 25,000 rows at a time. That shouldn't stop you, though: you can repeat the process, pull the data in smaller chunks (e.g. a few months at a time), and then consolidate them, as shown in the sketch below.
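
If you end up with several chunked exports, they are easy to stitch back together in R. Below is a minimal sketch; it assumes (purely for illustration) that the chunks were saved in your working directory as gsc_chunk_1.csv, gsc_chunk_2.csv, and so on, each with the same five columns.

# Hypothetical example: combine several chunked GSC exports into one CSV
files <- list.files(pattern = "^gsc_chunk_.*\\.csv$")
combined <- do.call(rbind, lapply(files, read.csv))
write.csv(combined, "data.csv", row.names = FALSE)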


2. Google Looker Studio (aka Data Studio)

Another easy way to get GSC data is with Looker Studio (formerly known as Google Data Studio). Connect Looker Studio to Google Search Console using the built-in connector, build a table with the columns listed earlier, and export it as a CSV (make sure the export includes all rows).

3. R script 

This is a great way of getting GSC data if you have a bit of technical knowledge (and it's also free).

All you need is access to RStudio (you can either install it locally on your computer or use the free web-based version available via Posit Cloud). You can find a sample R script below.

# This is an example R code which can be used to extract Query data from Google Search Console

########################################################################
# install necessary packages

if (!require(searchConsoleR)) {
  install.packages("searchConsoleR")
  library(searchConsoleR)
}

# the xlsx package is optional: only needed if you prefer to save
# an Excel file instead of the CSV written at the end of this script
if (!require(xlsx)) {
  install.packages("xlsx")
  library(xlsx)
}

########################################################################
#AUTHENTICATE
scr_auth()

########################################################################
#LIST ALL MY WEBSITES in SEARCH CONSOLE
list_websites()

########################################################################
# GET query/date data for the whole property

data_query <- search_analytics(
  siteURL         = "https://www.analyticshacker.com",
  startDate       = Sys.Date() - 93,  # starting roughly 3 months ago
  endDate         = Sys.Date() - 3,   # up to 3 days ago (the most recent days can be incomplete)
  # endDate       = '2020-10-26',     # or use a fixed end date instead
  dimensions      = c("query", "date"),
  searchType      = "web",
  # dimensionFilterExp = c("country==usa"),  # optional: filter to one country
  aggregationType = "byProperty",
  rowLimit        = 25000,            # default is 1000
  prettyNames     = TRUE,
  walk_data       = "byBatch"         # page through the API in batches
)

# show number of rows returned
nrow(data_query)

# preview the data
head(data_query)

# drop the ctr column (the prompts don't need it)
data_query <- data_query[, -which(names(data_query) == "ctr")]

# remove commas from the query column so they don't break the CSV
data_query$query <- gsub(",", "", data_query$query)

# show current working directory
getwd()

# write the data to a CSV file in the current working directory
# (col.names = TRUE keeps the header row that the prompts expect)
write.table(data_query, file = "data.csv", append = FALSE, row.names = FALSE,
            na = "", col.names = TRUE, sep = ",")
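
One caveat: depending on your searchConsoleR version, the column names returned by search_analytics() may not match the capitalisation the prompt pack expects (Query, Date, Impressions, Clicks, Position). If yours come back in lowercase, you can rename them before the write.table() step; a small sketch, assuming lowercase names:

# Hypothetical fix-up: normalise column names to the capitalisation
# required by the prompt pack (adjust the lowercase names if yours differ)
names(data_query)[names(data_query) == "query"]       <- "Query"
names(data_query)[names(data_query) == "date"]        <- "Date"
names(data_query)[names(data_query) == "impressions"] <- "Impressions"
names(data_query)[names(data_query) == "clicks"]      <- "Clicks"
names(data_query)[names(data_query) == "position"]    <- "Position"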