By @mhawksey

Collect/backup tweets in a Google Spreadsheet [Twitteralytics v2]

*** NEW VERSION – I’m keeping this post up for reference. You should get the very latest version TAGS v3 HERE! ***

Back in June I hastily threw together Using Google Spreadsheet to automatically monitor Twitter event hashtags and more. It kind of worked most of the time but there were a lot of rough edges. So here is:

*** TAGS v2.4 Google Spreadsheet ***
BTW Here’s How to archive event hashtags and create an interactive visualization of the conversation

Sticky: Google Spreadsheet sometimes misbehaves when copying the template. If the Twitter menu isn't appearing even when you click the button, open Tools > Script editor… and check that there is more code than just function myFunction() (if that is all you see, it indicates Google didn't copy the spreadsheet template properly for you). If you just see function myFunction then:

  1. try recopying the template until it works; or
  2. copy and paste the code here into the Script editor and save it

If you are still having problems leave a comment 😉
Update: I recently added the option to collect results in a single sheet. The update is included in the link above. More info about this here.
Update 2 (v2.3): Changes to get real user ids and entities (some other easter eggs too). The update is included in the link above. More info about this here.
Update 3 (v2.4): Changed the code slightly to play more nicely with the Twitter Search API.
Update 4 (v2.4.4): Changed the wording for weekly/daily and added an option to get tweets from a specific day within the last 7 days.

Here is a short video showing setup:

The concept is the same: you set up a search and Google Spreadsheet updates the results using a time-driven trigger. The main difference is that where the old version just collected date, author and tweet text, this version has the option of pulling back everything available from the Twitter GET Search API. Even better, you define what and where information is pulled by matching your column heading names with the Twitter search result schema, so if Twitter adds more data it's easy for you to pull it in (one small caveat is it can only pull top-level data; anything nested in the results is lost, apart from geo-tagging, which I thought was too useful to lose).
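To illustrate the top-level-only caveat, here is a plain-JavaScript sketch of the idea (mine, not the spreadsheet's actual code): nested objects in a result can't be matched to a single column heading, which is why geo coordinates get flattened into their own top-level field.

```javascript
// Sketch: only top-level values can be matched to a column heading, so
// nested geo data is flattened into a new top-level "geo_coordinates"
// string that a column named geo_coordinates can then pick up.
function flattenGeo(tweet) {
  if (tweet.geo != null) {
    tweet.geo_coordinates = "loc: " + tweet.geo.coordinates[0] + "," + tweet.geo.coordinates[1];
  }
  return tweet;
}

var t = flattenGeo({text: "hello", geo: {coordinates: [55.95, -3.19]}});
// t.geo_coordinates -> "loc: 55.95,-3.19"
```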
This was easily achieved by modifying Mikael Thuneberg's getTweets() function, which I originally used in version 1. That function collects search results as XML/ATOM; to get more flexibility I converted it to use the JSON results from Twitter. Because Google Apps Scripts are written using a JavaScript syntax it's easy to throw the JSON results around, and given that there are already handy bits of code for writing spreadsheet data using JavaScript objects I was able to achieve more flexibility easily.
As my focus this time around is just to harvest tweets, I dropped some of the original functionality for summarising 'top tweeters' and emailing results, but it's easy enough to drop in your own functions or gadgets to help you process the data. For example, you could use the Map Gadget to plot geo-located tweets. This spreadsheet has some examples of what you can do with the data.
The from:briankelly sheet includes a map gadget whilst the first 5 columns of #purposed contain functions for extracting top tweeters and a Google Spreadsheet readable timestamp.
With hosted services and downloadable apps like ThinkUp, why have a Google Spreadsheet version? I'm not entirely sure I've got a good answer to that one. The biggest advantage is maybe that it's a quick way to collect tweets, make them publicly available and collaborate on exploring the data.
Mainly for SEO, here is the code for the modified version of getTweets(); the rest of the script is available in the spreadsheet.

function getTweets(searchTerm, since, until, maxResults, languageCode) {
    // Based on Mikael Thuneberg's getTweets - mod by mhawksey to convert to JSON
    // if you include setRowsData this can be used to output chosen entries
    // (twDate() and REFERER are defined elsewhere in the spreadsheet script)
    try {
        var pagenum = 1;
        var data = [];
        var idx = 0;
        if (typeof maxResults == "undefined") {
            maxResults = 100;
        }
        if (maxResults > 1500) {
            maxResults = 1500;
        }
        if (maxResults > 100) {
            resultsPerPage = 100;
            maxPageNum = maxResults / 100;
        } else {
            resultsPerPage = maxResults;
            maxPageNum = 1;
        }
        Logger.log(twDate(since) + " " + twDate(until));
        searchTerm = encodeURIComponent(searchTerm);
        for (pagenum = 1; pagenum <= maxPageNum; pagenum++) {
            var URL = "";
            URL = URL + "?q=" + searchTerm;
            URL = URL + "&since=" + twDate(since);
            URL = URL + "&until=" + twDate(until);
            URL = URL + "&rpp=" + resultsPerPage;
            URL = URL + "&page=" + pagenum;
            URL = URL + "&result_type=recent";
            if (typeof languageCode != "undefined") {
                if (languageCode.length > 0) {
                    URL = URL + "&lang=" + languageCode;
                }
            }
            var response = UrlFetchApp.fetch(URL, {method: 'get', headers: {"User-Agent": REFERER}});
            if (response.getResponseCode() == 200) {
                var objects = Utilities.jsonParse(response.getContentText()).results;
                for (i in objects) { // not pretty but I wanted to extract geo data
                    if (objects[i].geo != null) {
                        objects[i]["geo_coordinates"] = "loc: " + objects[i].geo.coordinates[0] + "," + objects[i].geo.coordinates[1];
                    }
                    objects[i]["status_url"] = "" + objects[i].from_user + "/statuses/" + objects[i].id_str;
                    data[idx] = objects[i];
                    idx++;
                }
            }
        }
        return data;
    } catch (e) {
        return e.message;
    }
}