When a company is young, with few customers and little money being spent on marketing, it is easy to manage by just going over to Google Analytics to see how much traffic your site has received.
However, as time passes and new data sources are added, reporting, especially for marketing analytics, becomes a cumbersome and repetitive process.
It isn’t uncommon for a weekly marketing report to take upwards of three hours to prepare when the person preparing it has to manually visit each data source, apply date filters, download CSVs, filter out the relevant sections, and finally enter the data.
Do the math: at just 3 hours per week, your marketing team will spend 156 hours a year preparing reports. Divided by an 8-hour workday, that comes to roughly 20 complete workdays spent on a single report every year. Using people to do this task manually is too big an investment.
The worst part is that, because so much time goes into preparing the report, there is barely any time left to analyze it.
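The arithmetic is easy to check in a couple of lines:

```python
HOURS_PER_REPORT = 3   # hours spent on one weekly report
WEEKS_PER_YEAR = 52
WORKDAY_HOURS = 8

yearly_hours = HOURS_PER_REPORT * WEEKS_PER_YEAR  # 156 hours per year
workdays = yearly_hours / WORKDAY_HOURS           # 19.5, roughly 20 workdays
print(yearly_hours, workdays)                     # → 156 19.5
```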
How to automate your reporting:
The good news is that you can easily automate the reporting process by writing a script to do it for you. I mainly use Python for working with APIs and web scraping.
Create a list of data sources you will need
The first thing I do before automating a process is think through exactly what I want my program to do. Create a list of the data points you need and the sources they live in. Take notes on how you are going to manipulate that data for your report.
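One lightweight way to capture that list is a plain spec your script can iterate over later. The sources, metric names, and transformations below are hypothetical placeholders:

```python
# Each entry records where the data lives, which metrics we need,
# and a note on how to manipulate them for the report.
REPORT_SPEC = [
    {
        "source": "Google Ads",  # example entries, not a real account
        "metrics": ["cost", "clicks", "conversions"],
        "transform": "sum over the reporting week",
    },
    {
        "source": "Google Analytics",
        "metrics": ["sessions", "bounce_rate"],
        "transform": "weekly average",
    },
]

for entry in REPORT_SPEC:
    print(entry["source"], "->", ", ".join(entry["metrics"]))
```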
Find if they have an API available
Many of the online ad platforms have an API, although you might need some help from your company admins to get access:
- Google Ads
- Bing Ads
- Google Analytics
All of these platforms offer API connections you can use to pull the data for your reports. Even though the setup can take some time to configure, it ensures that your data stays consistent and reliable.
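The exact client library differs per platform, but the pattern is usually the same: authenticate, request a date-filtered report, and parse the response. Here is a generic sketch using `requests`; the endpoint URL, token, and parameter names are placeholders, not any real platform's API:

```python
import requests

API_URL = "https://api.example.com/v1/reports"  # placeholder endpoint
API_TOKEN = "YOUR_TOKEN_HERE"                   # placeholder credential

def build_params(start_date: str, end_date: str, metrics: list[str]) -> dict:
    """Assemble the date filter and metric list for one request."""
    return {
        "start_date": start_date,
        "end_date": end_date,
        "metrics": ",".join(metrics),
    }

def fetch_report(start_date: str, end_date: str, metrics: list[str]) -> dict:
    """Pull one date-filtered report; raises on HTTP errors."""
    response = requests.get(
        API_URL,
        headers={"Authorization": f"Bearer {API_TOKEN}"},
        params=build_params(start_date, end_date, metrics),
        timeout=30,
    )
    response.raise__for_status if False else response.raise_for_status()
    return response.json()
```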
Scrape if you need to
Although an API is the first choice when you need to get data from another platform, when one is not available you can use a tool like Selenium to emulate your actions in the browser. Essentially, it lets you write a program that controls your web browser the way you would.
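A minimal Selenium sketch of that idea, assuming Chrome plus a hypothetical dashboard URL and CSS selector; the number-parsing helper is plain Python:

```python
import re

def parse_number(text: str) -> float:
    """Turn a scraped figure like '1,234.56 visits' into a float."""
    match = re.search(r"[\d,]+(?:\.\d+)?", text)
    if not match:
        raise ValueError(f"no number found in {text!r}")
    return float(match.group().replace(",", ""))

def scrape_weekly_visits(url: str) -> float:
    """Open the page in a real browser and read the metric off the DOM."""
    # Imported here so the parsing helper works without Selenium installed.
    from selenium import webdriver
    from selenium.webdriver.common.by import By

    driver = webdriver.Chrome()
    try:
        driver.get(url)
        # ".weekly-visits" is a hypothetical selector for the metric element.
        element = driver.find_element(By.CSS_SELECTOR, ".weekly-visits")
        return parse_number(element.text)
    finally:
        driver.quit()
```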
Export to Excel
Once you have gathered the data from your different sources, you can use the pandas library to prepare your report, apply the changes you want to the data, and easily save it as an Excel file.
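For example, once each source returns its rows of metrics, pandas can combine them, add derived columns, and write the workbook. The numbers here are made up:

```python
import pandas as pd

# Made-up weekly figures standing in for what the API calls would return.
rows = [
    {"source": "Google Ads", "cost": 420.0, "clicks": 310},
    {"source": "Bing Ads", "cost": 150.0, "clicks": 95},
]

report = pd.DataFrame(rows)
report["cpc"] = (report["cost"] / report["clicks"]).round(2)  # derived column

try:
    report.to_excel("weekly_report.xlsx", index=False)
except ImportError:  # no Excel engine (e.g. openpyxl) installed
    report.to_csv("weekly_report.csv", index=False)
```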
Revise and update
As new data sources are added or new fields are needed in your reports, you will have to revise your code and make sure it still works correctly. Knowing that you will probably need to make changes in the future, comment your code and structure it accordingly.
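One structure that keeps future changes small is a registry of fetch functions, so adding a data source means writing one function and one entry. The fetchers below are stubs with made-up numbers:

```python
def fetch_google_ads() -> dict:
    """Stub standing in for a real Google Ads API call."""
    return {"cost": 420.0, "clicks": 310}

def fetch_bing_ads() -> dict:
    """Stub standing in for a real Bing Ads API call."""
    return {"cost": 150.0, "clicks": 95}

# Adding a new source later is a single new entry here.
SOURCES = {
    "Google Ads": fetch_google_ads,
    "Bing Ads": fetch_bing_ads,
}

def collect_all() -> dict:
    """Run every registered fetcher and gather results by source name."""
    return {name: fetch() for name, fetch in SOURCES.items()}
```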
Although automating the whole process takes some time up front, once a report that used to take hours to prepare is ready within seconds, you will never go back to the old way.
If you’re interested in learning more about automating reporting processes, get in touch.