Toggl to BigQuery

This page provides you with instructions on how to extract data from Toggl and load it into Google BigQuery. (If this manual process sounds onerous, check out Stitch, which can do all the heavy lifting for you in just a few clicks.)

What is Toggl?

Toggl offers online time tracking and reporting services through a web interface as well as mobile and desktop applications.

What is Google BigQuery?

Google BigQuery is a data warehouse that delivers super-fast results from SQL queries, which it accomplishes using a powerful engine dubbed Dremel. With BigQuery, there's no need to spin up (and down) clusters of machines as you work with your data, which is why people often say that BigQuery prioritizes querying over administration. It's super fast, and that's the main reason most folks use it.

Getting data out of Toggl

Toggl has a couple of APIs that developers can use to interact with the platform. The Reports API lets you get information out. For example, to retrieve a summary report, you could call GET "https://toggl.com/reports/api/v2/summary?workspace_id=123&since=2018-12-19&until=2018-12-20&user_agent=api_test".
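
As a minimal sketch, here's what that call might look like using Python's requests library. The API token, workspace ID, and date range are placeholders; the Reports API authenticates with HTTP Basic Auth, using your API token as the username and the literal string api_token as the password.

import requests

API_TOKEN = "your-toggl-api-token"  # placeholder -- find yours in your Toggl profile

params = {
    "workspace_id": 123,       # placeholder workspace ID
    "since": "2018-12-19",
    "until": "2018-12-20",
    "user_agent": "api_test",  # Toggl asks API clients to identify themselves
}

# Basic Auth: API token as username, the string "api_token" as password
response = requests.get(
    "https://toggl.com/reports/api/v2/summary",
    params=params,
    auth=(API_TOKEN, "api_token"),
)
response.raise_for_status()
summary = response.json()
print(summary["total_grand"])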

Sample Toggl data

Here's an example of the kind of response you might see from a query like the one above.

{
    "total_grand":36004000,
    "total_billable":14400000,
    "total_currencies":[{"currency":"EUR","amount":40}],
    "data": [
      {
        "id":73569,
        "title":{"project":"Toggl Desktop","client":"Toggl"},
        "time":14400000,
        "total_currencies":[{"currency":"EUR","amount":40}],
        "items":[
          {
            "title":{"time_entry":"Implementing some important things"},
            "time":14400000,
            "cur":"EUR",
            "sum":40,
            "rate":10
          }
        ]
      },{
        "id":193009951,
        "title":{"project":"Toggl Development","client":null},
        "time":14400000,
        "total_currencies":[{"currency":"EUR","amount":0}],
        "items":[
          {
            "title":{"time_entry":"Hard work"},
            "time":14400000,
            "cur":"EUR",
            "sum":0,
            "rate":50
          }
        ]
      },{
        "id":null,
        "title":{"project":null,"client":null},
        "time":7204000,
        "total_currencies":[],
        "items":[
          {
            "title":{"time_entry":"No title yet"},
            "time":1000,
            "cur":"EUR",
            "sum":0,
            "rate":50
          },{
            "title":{"time_entry":"Did nothing"},
            "time":1000,
            "cur":"EUR",
            "sum":0,
            "rate":50
          },{
            "title":{"time_entry":"Hard work again"},
            "time":7202000,
            "cur":"EUR",
            "sum":0,
            "rate":50
          }
        ]
      }
    ]
}

Preparing Toggl data

If you don't already have a data structure in which to store the data you retrieve, you'll have to create a schema for your data tables. Then, for each value in the response, you'll need to identify a predefined datatype (INTEGER, DATETIME, etc.) and build a table that can receive them. Toggl's documentation should tell you what fields are provided by each endpoint, along with their corresponding datatypes.
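
As a sketch of what a table definition might look like in code, here's one way to declare a schema with the google-cloud-bigquery Python client. The table layout and field names are hypothetical, inferred from the sample response above; verify the actual fields and datatypes against Toggl's docs.

from google.cloud import bigquery

# Hypothetical schema for time entry items, inferred from the
# sample summary response above
ITEMS_SCHEMA = [
    bigquery.SchemaField("project_id", "INTEGER"),
    bigquery.SchemaField("time_entry", "STRING"),
    bigquery.SchemaField("time_ms", "INTEGER"),
    bigquery.SchemaField("currency", "STRING"),
    bigquery.SchemaField("sum", "FLOAT"),
    bigquery.SchemaField("rate", "FLOAT"),
]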

Complicating things is the fact that the records retrieved from the source may not always be "flat" – some of the objects may actually be lists. In these cases you'll likely have to create additional tables to capture the unpredictable cardinality in each record.
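
To make that concrete, here's a rough sketch, assuming the sample summary response above, that splits the nested payload into two flat record sets -- one row per project, one row per time entry item -- and writes them out as newline-delimited JSON, a format BigQuery can load directly. The file and field names are illustrative.

import json

def flatten_summary(summary):
    """Split the nested summary payload into one row per project
    and one row per time entry item."""
    project_rows, item_rows = [], []
    for group in summary["data"]:
        project_rows.append({
            "project_id": group["id"],
            "project": group["title"]["project"],
            "client": group["title"]["client"],
            "time_ms": group["time"],
        })
        for item in group["items"]:
            item_rows.append({
                "project_id": group["id"],  # ties each item back to its project
                "time_entry": item["title"]["time_entry"],
                "time_ms": item["time"],
                "currency": item["cur"],
                "sum": item["sum"],
                "rate": item["rate"],
            })
    return project_rows, item_rows

with open("summary.json") as f:  # the API response saved earlier
    projects, items = flatten_summary(json.load(f))

for path, rows in (("projects.json", projects), ("items.json", items)):
    with open(path, "w") as out:
        for row in rows:
            out.write(json.dumps(row) + "\n")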

Loading data into Google BigQuery

Google Cloud Platform offers a helpful guide for loading data into BigQuery. You can use the bq command-line tool to upload the files to your awaiting datasets, adding the correct schema and data type information along the way. The bq load command is your friend here. You can find the syntax in the bq command-line tool quickstart guide. Iterate through this process as many times as it takes to load all of your tables into BigQuery.
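
If you'd rather script the load than run bq by hand, the google-cloud-bigquery Python client does the same job. Here's a sketch, assuming the items.json file from the previous step and a hypothetical dataset named toggl:

from google.cloud import bigquery

client = bigquery.Client()  # picks up GOOGLE_APPLICATION_CREDENTIALS

job_config = bigquery.LoadJobConfig(
    source_format=bigquery.SourceFormat.NEWLINE_DELIMITED_JSON,
    autodetect=True,  # or pass an explicit schema like ITEMS_SCHEMA above
)

with open("items.json", "rb") as f:
    job = client.load_table_from_file(f, "toggl.items", job_config=job_config)

job.result()  # block until the load job finishes
print(f"Loaded {job.output_rows} rows into toggl.items")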

Keeping Toggl data up to date

At this point you've coded up a script or written a program to get the data you want and successfully moved it into your data warehouse. But how will you load new or updated data? It's not a good idea to replicate all of your data each time you have updated records. That process would be painfully slow and resource-intensive.

The key is to build your script in such a way that it can identify incremental updates to your data, using datetime request parameters like since and until to identify records that are new since your last update (or since the newest record you've copied). Once you've taken new data into account, you can set your script up as a cron job or continuous loop to keep pulling down new data as it appears.
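
Here's one way that might look: a sketch that persists a high-water mark between runs, so each invocation only asks for dates it hasn't seen. The token, workspace ID, and state file are placeholders.

import datetime
import pathlib

import requests

API_TOKEN = "your-toggl-api-token"   # placeholder
WORKSPACE_ID = 123                   # placeholder
STATE_FILE = pathlib.Path("last_run.txt")

def incremental_pull():
    # Resume from the date saved after the last successful run;
    # default to yesterday on the very first run.
    if STATE_FILE.exists():
        since = STATE_FILE.read_text().strip()
    else:
        since = (datetime.date.today() - datetime.timedelta(days=1)).isoformat()
    until = datetime.date.today().isoformat()

    response = requests.get(
        "https://toggl.com/reports/api/v2/summary",
        params={"workspace_id": WORKSPACE_ID, "since": since,
                "until": until, "user_agent": "api_test"},
        auth=(API_TOKEN, "api_token"),
    )
    response.raise_for_status()

    STATE_FILE.write_text(until)  # advance the high-water mark
    return response.json()

if __name__ == "__main__":
    summary = incremental_pull()  # run this from cron to stay current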

Other data warehouse options

BigQuery is great, but sometimes you need to optimize for different things when you're choosing a data warehouse. Some folks choose to go with Amazon Redshift, PostgreSQL, Snowflake, or Microsoft Azure SQL Data Warehouse, which are RDBMSes that use similar SQL syntax, or Panoply, which works with Redshift instances. If you're interested in seeing the relevant steps for loading data into one of these platforms, check out To Redshift, To Postgres, To Snowflake, To Panoply, and To Azure SQL Data Warehouse.

Easier and faster alternatives

If all this sounds a bit overwhelming, don’t be alarmed. If you have all the skills necessary to go through this process, chances are building and maintaining a script like this isn’t a very high-leverage use of your time.

Thankfully, products like Stitch were built to move data from Toggl to Google BigQuery automatically. With just a few clicks, Stitch starts extracting your Toggl data via the API, structuring it in a way that is optimized for analysis, and inserting that data into your Google BigQuery data warehouse.