Connect GitHub to Google Sheets for Automated Data Exports

The instructions provided in this article are sufficient for creating an automated GitHub to Google Sheets integration. However, please note that we are not responsible for the outcome if you apply the instructions incorrectly, specify incorrect settings, or if there are changes in the app’s API. If you need help with the setup, we can provide this premium integration for an additional fee.

The GitHub REST API lets you get all sorts of data from GitHub. You can do this programmatically or without a single line of code. The latter option sounds rather interesting, right? If so, read on to learn how you can load data from GitHub to Google Sheets in a few clicks and automate the process.

Sync GitHub with Google Sheets: What options are available?

Meet Sarah, the manager of a product whose source code is hosted on GitHub. As a good product manager, she wants to make sure the project is evolving. For this, she needs to analyze project statistics such as pull requests, commits, contributors, etc. Sarah can get all this information into a spreadsheet using the GitHub API. There are two methods:

  • Code-based – a few lines of code in Python or Apps Script will do the job. 
  • No-code – a few clicks in Coupler.io will let you set up a GitHub – Google Sheets integration and load your data automatically on a schedule.

Why spend time coding if there is a simpler, click-based way to import data from GitHub to Google Sheets? Let’s check it out!

GitHub – Google Sheets integration without coding

Sarah’s GitHub – Google Sheets integration will rely on Coupler.io. It’s a solution for importing data from multiple sources into Google Sheets, Excel, or BigQuery.

Figure 01. Coupler.io as a solution to import data from different sources into Excel

Coupler.io provides 15+ ready-to-use integrations, but GitHub is not on the list yet. At the same time, Coupler.io provides a JSON integration to connect an API to Google Sheets. We’ll check out how it works below, but first we need to obtain a GitHub API token.

How to get a GitHub API token

  • Sign in to GitHub and go to this link. Alternatively, you can go to Settings => Developer settings => Personal access tokens. Click Generate new token.
  • On the next screen, you’ll need to configure your access token:
    • Specify what the token is for
    • Select the expiration period 
    • Select scopes 
Figure 02. Configure the access token

Note: Scopes let you specify exactly what type of access you need. For example, the read:user scope allows you to read ALL user profile data.

  • Click Generate token once you’re ready and copy the token you get.
Figure 03. Copy the generated access token

The access token is a key element for connecting GitHub to Google Sheets with Coupler.io. Now we can go further.
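
Optionally, you can quickly check that the token works before moving on. Here is a minimal Apps Script sketch (the function name is ours, and the token value is a placeholder) that calls the GitHub /user endpoint with the token and logs your profile:

function checkGitHubToken() {
  // Paste the personal access token you just generated.
  var token = 'insert-your-token';
  // The /user endpoint returns the profile of the authenticated user.
  var response = UrlFetchApp.fetch('https://api.github.com/user', {
    headers: {
      'Authorization': 'token ' + token,
      'Accept': 'application/vnd.github.v3+json'
    }
  });
  Logger.log(response.getContentText());
}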

Connect GitHub with Google Sheets

  • Sign up for Coupler.io with your Google or Microsoft account.
  • Click Add new importer and name it. In our case, Sarah named it GitHub to Google Sheets. After that, you need to complete three steps to set up the integration:

Set up source 

  • Select JSON as the source application. Click Continue.
Figure 04.1. Select JSON as the source application
  • Specify the JSON URL according to the data you want to export from GitHub. Click Continue.

The basic GitHub API URL is:

https://api.github.com/

Depending on the data entity you’re exporting from GitHub, you append the respective endpoint to the basic JSON URL. For example, to get a list of GitHub organizations, the JSON URL is the following:

https://api.github.com/organizations
Figure 04.2. Specify the JSON URL

Check out the data entities you can export from GitHub to Google Sheets and their JSON URLs.
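
For instance, to list the pull requests of a specific repository, you would append the pull requests endpoint from that list (the octocat/hello-world owner and repo names below are placeholders for illustration):

https://api.github.com/repos/octocat/hello-world/pulls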


  • Select GET as the HTTP method.
  • Add the following strings to the HTTP headers field:
Authorization: token {insert-your-token}
Accept: application/vnd.github.v3+json
Figure 04.3. Add the authorization header

Note: Authentication with the personal access token is not the only way to access the GitHub API. It works well for our case, but for OAuth Apps or GitHub Apps in production, you should authenticate using the web application flow or on behalf of the app installation.
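
For reference, here is a rough Apps Script equivalent of the request this configuration produces, with the same URL and headers. It is just a sketch, not part of the Coupler.io setup; the repository path is a placeholder, and the token value should be replaced with your own:

function fetchPullRequests() {
  var token = 'insert-your-token';
  var url = 'https://api.github.com/repos/octocat/hello-world/pulls';
  // GET request with the same Authorization and Accept headers as above.
  var response = UrlFetchApp.fetch(url, {
    method: 'get',
    headers: {
      'Authorization': 'token ' + token,
      'Accept': 'application/vnd.github.v3+json'
    }
  });
  var pulls = JSON.parse(response.getContentText());
  Logger.log(pulls.length + ' pull requests on the first page');
}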

The remaining parameters (URL query, Request body, Columns, and Path) are optional.

  • URL query allows you to refine the exported data with query parameters. For example, to export all closed user projects, you’ll need to specify the following in the URL query field (see the example after this list):
state: closed

Note: All available query options can be found in the GitHub API documentation.

  • Request body allows you to specify the body of the request; it is not used with GET requests.
  • Columns allows you to specify which columns you want to export.
  • Path allows you to select nested objects from the JSON response.
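
As a rough illustration, a value such as state: closed in the URL query field is simply appended to the JSON URL as a query string. Applied to the pull requests endpoint used earlier (the owner and repo names are placeholders), the resulting request URL would be:

https://api.github.com/repos/octocat/hello-world/pulls?state=closed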

Now you can jump to the destination settings.

Set up destination (Google Sheets)

  • Choose Google Sheets as the destination app.
Figure 05.1. Choose Google Sheets as the destination app
  • Connect your Google account.
Figure 05.2. Connect your Google account
  • Pick a Google Sheets file on your Google Drive and select a sheet that will receive the data exported from GitHub. You can create a new sheet by entering a new name.
Figure 05.3. Select the destination file and sheet

Optional parameters are Cell address and Import mode. You can read more about them in the Coupler.io knowledge base.

You can get your GitHub data to Google Sheets right away by clicking Save and Run. To see the results, click the View Results button.

Figure 06. View the results

However, Sarah wants to get her data updated recurrently, say once a day. For this, she needs to set up the schedule.

Set up schedule

Enable Automatic data refresh and configure the frequency of data exports:

  • Interval
  • Days of the week
  • Time of the update
  • Time zone
Figure 07. Set up the schedule

This is the common flow for setting up a GitHub to Google Sheets integration that exports a specific type of data. If you need to export different data, you’ll need to create a separate importer for each data type you’re going to load. Fortunately, Coupler.io allows you to easily copy importers.

Figure 08. Copy an importer

What GitHub data you can pull into Sheets

You can find complete information about the resources available in the GitHub REST API in the GitHub reference documentation.

Since Sarah needs this information to analyze project health, here are the GitHub resources that should meet her needs:

Description – Endpoint to add to the basic JSON URL*

GitHub stats
  • Get the weekly commit activity – /repos/{owner}/{repo}/stats/code_frequency
  • Get the weekly commit count – /repos/{owner}/{repo}/stats/participation
  • Get the hourly commit count for each day – /repos/{owner}/{repo}/stats/punch_card
  • Get the last year of commit activity – /repos/{owner}/{repo}/stats/commit_activity
  • Get all contributor commit activity – /repos/{owner}/{repo}/stats/contributors

Other actionable resources
  • List organization repositories – /orgs/{org}/repos
  • Get a repository – /repos/{owner}/{repo}
  • List repository contributors – /repos/{owner}/{repo}/contributors
  • List repository collaborators – /repos/{owner}/{repo}/collaborators
  • List repositories for a user – /users/{username}/repos
  • List repository tags – /repos/{owner}/{repo}/tags
  • List repository teams – /repos/{owner}/{repo}/teams
  • List branches – /repos/{owner}/{repo}/branches
  • List commits – /repos/{owner}/{repo}/commits
  • List pull requests associated with a commit – /repos/{owner}/{repo}/commits/{commit_sha}/pulls
  • List pull requests – /repos/{owner}/{repo}/pulls
  • List commits on a pull request – /repos/{owner}/{repo}/pulls/{pull_number}/commits
  • List repository events – /repos/{owner}/{repo}/events

* The basic JSON URL is https://api.github.com/

Exporting large amounts of records via the GitHub API

Sarah exported a list of pull requests from her repo and got only 30 records, although the list is actually much longer. The reason is that the GitHub API paginates each response to 30 items by default. To export the rest of the records, use the page parameter in the URL query field and specify the next pages: 2, 3, and so on. If you omit the page parameter, the GitHub API returns the first page.

Figure 09.1. The page parameter in the URL query field

Note: In this case, the append mode offered by Coupler.io is a perfect way to collect all your records in a spreadsheet. It appends the data from the next pages to the previously exported data.

For some data entities, you can use the per_page parameter, which lets you increase the page size to up to 100 records.

Figure 09.2. The per_page parameter in the URL query field
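
If you were paging through the records in code instead, the same page and per_page parameters would drive the loop. Here is a rough Apps Script sketch (the repository path is a placeholder, and the token value should be replaced with your own):

function fetchAllPullRequests() {
  var token = 'insert-your-token';
  var all = [];
  var page = 1;
  while (true) {
    var url = 'https://api.github.com/repos/octocat/hello-world/pulls' +
              '?state=closed&per_page=100&page=' + page;
    var response = UrlFetchApp.fetch(url, {
      headers: {
        'Authorization': 'token ' + token,
        'Accept': 'application/vnd.github.v3+json'
      }
    });
    var batch = JSON.parse(response.getContentText());
    all = all.concat(batch);
    // A page with fewer than per_page items is the last one.
    if (batch.length < 100) {
      break;
    }
    page++;
  }
  Logger.log(all.length + ' pull requests in total');
  return all;
}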

Meanwhile, some endpoints use cursor-based pagination, which does not allow you to navigate to a specific page using the page parameter alone. You’ll need additional parameters, such as after, before, since, etc. For example, pagination for exporting a list of users relies exclusively on the since parameter, so here is how it may look:

Figure 09.3. The since parameter in the URL query field

Import GitHub data to Google Sheets using Apps Script

The option above allowed Sarah to get her GitHub data into a spreadsheet with a few clicks. Alternatively, the flow can be implemented in Google Sheets using Apps Script, but it will take some coding time.

For example, you can create a custom function to fetch data from GitHub to Google Sheets with the following syntax:

=IMPORTAPI("json_url", "api_token")
  • json_url is the GitHub JSON URL to fetch data from
  • api_token is your access token
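
For example, with a placeholder public repository and your token pasted in place of insert-your-token, a cell formula could look like this:

=IMPORTAPI("https://api.github.com/repos/octocat/hello-world/pulls?state=closed", "insert-your-token")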

And here is how it works in action:

Figure 10. The IMPORTAPI custom function in action

The code sample we used for this is a slightly modified version of ImportJSON by bradjasper:

// Fetches JSON from the given URL using the GitHub token and flattens it into a 2D array for the sheet.
function ImportAPI(url, api_token, query, parseOptions) {
  var header = {headers: {
                'Authorization': 'token '+ api_token,
                'Accept': 'application/json'}};
  return ImportAPIAdvanced(url, header, query, parseOptions, includeXPath_, defaultTransform_);
}
 
function ImportAPIAdvanced(url, fetchOptions, query, parseOptions, includeFunc, transformFunc) {
  var jsondata = UrlFetchApp.fetch(url, fetchOptions);
  var object   = JSON.parse(jsondata.getContentText());
  
  return parseJSONObject_(object, query, parseOptions, includeFunc, transformFunc);
}
 
function URLEncode(value) {
  return encodeURIComponent(value.toString());  
}
 
function parseJSONObject_(object, query, options, includeFunc, transformFunc) {
  var headers = new Array();
  var data    = new Array();
  
  if (query && !Array.isArray(query) && query.toString().indexOf(",") != -1) {
    query = query.toString().split(",");
  }
 
  // Prepopulate the headers to lock in their order
  if (hasOption_(options, "allHeaders") && Array.isArray(query))
  {
    for (var i = 0; i < query.length; i++)
    {
      headers[query[i]] = Object.keys(headers).length;
    }
  }
  
  if (options) {
    options = options.toString().split(",");
  }
    
  parseData_(headers, data, "", {rowIndex: 1}, object, query, options, includeFunc);
  parseHeaders_(headers, data);
  transformData_(data, options, transformFunc);
  
  return hasOption_(options, "noHeaders") ? (data.length > 1 ? data.slice(1) : new Array()) : data;
}
 
function parseData_(headers, data, path, state, value, query, options, includeFunc) {
  var dataInserted = false;
 
  if (Array.isArray(value) && isObjectArray_(value)) {
    for (var i = 0; i < value.length; i++) {
      if (parseData_(headers, data, path, state, value[i], query, options, includeFunc)) {
        dataInserted = true;
 
        if (data[state.rowIndex]) {
          state.rowIndex++;
        }
      }
    }
  } else if (isObject_(value)) {
    for (var key in value) {
      if (parseData_(headers, data, path + "/" + key, state, value[key], query, options, includeFunc)) {
        dataInserted = true; 
      }
    }
  } else if (!includeFunc || includeFunc(query, path, options)) {
    // Handle arrays containing only scalar values
    if (Array.isArray(value)) {
      value = value.join(); 
    }
    
    if (!data[state.rowIndex]) {
      data[state.rowIndex] = new Array();
    }
    
    if (!headers[path] && headers[path] != 0) {
      headers[path] = Object.keys(headers).length;
    }
    
    data[state.rowIndex][headers[path]] = value;
    dataInserted = true;
  }
  
  return dataInserted;
}
 
function parseHeaders_(headers, data) {
  data[0] = new Array();
 
  for (var key in headers) {
    data[0][headers[key]] = key;
  }
}
 
function transformData_(data, options, transformFunc) {
  for (var i = 0; i < data.length; i++) {
    for (var j = 0; j < data[0].length; j++) {
      transformFunc(data, i, j, options);
    }
  }
}
 
function isObject_(test) {
  return Object.prototype.toString.call(test) === '[object Object]';
}
 
function isObjectArray_(test) {
  for (var i = 0; i < test.length; i++) {
    if (isObject_(test[i])) {
      return true; 
    }
  }  
 
  return false;
}
 
function includeXPath_(query, path, options) {
  if (!query) {
    return true; 
  } else if (Array.isArray(query)) {
    for (var i = 0; i < query.length; i++) {
      if (applyXPathRule_(query[i], path, options)) {
        return true; 
      }
    }  
  } else {
    return applyXPathRule_(query, path, options);
  }
  
  return false; 
}
 
function applyXPathRule_(rule, path, options) {
  return path.indexOf(rule) == 0; 
}
 
function defaultTransform_(data, row, column, options) {
  if (data[row][column] == null) {
    if (row < 2 || hasOption_(options, "noInherit")) {
      data[row][column] = "";
    } else {
      data[row][column] = data[row-1][column];
    }
  } 
 
  if (!hasOption_(options, "rawHeaders") && row == 0) {
    if (column == 0 && data[row].length > 1) {
      removeCommonPrefixes_(data, row);  
    }
    
    data[row][column] = toTitleCase_(data[row][column].toString().replace(/[\/\_]/g, " "));
  }
  
  if (!hasOption_(options, "noTruncate") && data[row][column]) {
    data[row][column] = data[row][column].toString().substr(0, 256);
  }
 
  if (hasOption_(options, "debugLocation")) {
    data[row][column] = "[" + row + "," + column + "]" + data[row][column];
  }
}
 
function removeCommonPrefixes_(data, row) {
  var matchIndex = data[row][0].length;
 
  for (var i = 1; i < data[row].length; i++) {
    matchIndex = findEqualityEndpoint_(data[row][i-1], data[row][i], matchIndex);
 
    if (matchIndex == 0) {
      return;
    }
  }
  
  for (var i = 0; i < data[row].length; i++) {
    data[row][i] = data[row][i].substring(matchIndex, data[row][i].length);
  }
}
 
function findEqualityEndpoint_(string1, string2, stopAt) {
  if (!string1 || !string2) {
    return -1; 
  }
  
  var maxEndpoint = Math.min(stopAt, string1.length, string2.length);
  
  for (var i = 0; i < maxEndpoint; i++) {
    if (string1.charAt(i) != string2.charAt(i)) {
      return i;
    }
  }
  
  return maxEndpoint;
}
  
 
function toTitleCase_(text) {
  if (text == null) {
    return null;
  }
  
  return text.replace(/\w\S*/g, function(word) { return word.charAt(0).toUpperCase() + word.substr(1).toLowerCase(); });
}
 
function hasOption_(options, option) {
  return options && options.indexOf(option) >= 0;
}
 
function parseToObject_(text) {
  var map     = new Object();
  var entries = (text != null && text.trim().length > 0) ? text.toString().split(",") : new Array();
  
  for (var i = 0; i < entries.length; i++) {
    addToMap_(map, entries[i]);  
  }
  
  return map;
}
 
function addToMap_(map, entry) {
  var equalsIndex = entry.indexOf("=");  
  var key         = (equalsIndex != -1) ? entry.substring(0, equalsIndex) : entry;
  var value       = (key.length + 1 < entry.length) ? entry.substring(key.length + 1) : "";
  
  map[key.trim()] = value;
}
 
function toBool_(value) {
  return value == null ? false : (value.toString().toLowerCase() == "true" ? true : false);
}
 
function convertToBool_(map, key) {
  if (map[key] != null) {
    map[key] = toBool_(map[key]);
  }  
}
 
function getDataFromNamedSheet_(sheetName) {
  var ss = SpreadsheetApp.getActiveSpreadsheet();
  var source = ss.getSheetByName(sheetName);
  
  var jsonRange = source.getRange(1,1,source.getLastRow());
  var jsonValues = jsonRange.getValues();
  
  var jsonText = "";
  for (var row in jsonValues) {
    for (var col in jsonValues[row]) {
      jsonText +=jsonValues[row][col];
    }
  }
  Logger.log(jsonText);
  return JSON.parse(jsonText);
}

You can customize this script or write your own in Apps Script or Python. Alternatively, you can save time by using Coupler.io, a solution that lets you export GitHub data to Google Sheets with just a few clicks. This is what Sarah will go with to meet her needs.

Do you need a ready-to-use GitHub to Google Sheets integration?

Although Sarah spent just a few minutes connecting GitHub to Google Sheets, she had to check out the GitHub API documentation and tinker with pagination and a few other things to make it work. With other integrations provided by Coupler.io, the setup mostly consists of selecting values from dropdown lists. So, Sarah would be happy to have a ready-to-use GitHub to Google Sheets integration at hand. If you need one as well, let us know by filling out this brief questionnaire. Good luck with your data!

  • Zakhar Yung

    A content manager at Coupler.io whose key responsibility is to ensure that the readers love our content on the blog. With 5 years of experience as a wordsmith in SaaS, I know how to make texts resonate with readers' queries✍🏼
