How we use incremental modelling to handle billions of events every day 🚀

Hi there :wave:

I’m Jake, a Senior Analytics Engineer here at Monzo. Our team is responsible for creating a best-in-class data warehouse, designing and building data models to support our wider data team of data analysts and scientists in generating the powerful insights that drive Monzo’s growth.

We’re also responsible for ensuring our data pipeline is highly performant, identifying opportunities for optimisation and implementing more scalable modelling methods. It’s this side of the job that is the topic of my latest blog post, which discusses how we achieve significant performance gains in our pipeline by embedding incremental modelling into our ways of working :rocket:

If you have any questions about analytics engineering - or data in general - at Monzo, please fire away!


As someone who’s recently started working with dbt, this was a good read. I’m going to add a regular full refresh to my incremental models right now…
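For anyone else who’s new to dbt, here’s a minimal sketch of what an incremental model looks like (the table and column names are hypothetical, not from the article):

```sql
-- models/events_incremental.sql
-- Only process new rows on normal runs; rebuild everything on a full refresh.
{{ config(materialized='incremental', unique_key='event_id') }}

select
    event_id,
    event_type,
    created_at
from {{ ref('raw_events') }}

{% if is_incremental() %}
-- On incremental runs, {{ this }} refers to the existing target table,
-- so we only pick up rows newer than what we've already loaded.
where created_at > (select max(created_at) from {{ this }})
{% endif %}
```

A scheduled `dbt run --full-refresh` then rebuilds the table from scratch periodically, catching any late-arriving or updated rows the incremental filter would otherwise miss.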


Thank you Jake - I enjoyed the article, although I did need to re-read some sections.

This was not due to any lack of clarity, but entirely down to my own lack of understanding, as such things have never really been part of my world, either professional or personal.

And that’s exactly why I really appreciate articles like this from the team. Please keep them coming.


Thank you - really glad to hear you enjoyed it and please don’t hesitate to ask if you have any questions! :smiley: