Are You Losing Time and Budget in the Bermuda Analytics Triangle?

In this edition of our blog, we take a look at where your time, effort and budget ultimately go in Data Analytics. We thought we would start with a quiz:


Where is the biggest proportion of your Data Scientists' time spent? 

A) Changing the world by building Predictive Models using Machine Learning algorithms?
B) Developing breakthrough code to mine valuable Business Insight in seconds?
C) In meetings with stakeholders planning the future and how to conquer it?

These are all worthy pursuits, but more often than not we find it’s the hidden answer D) Data Engineering, or Data Management and BI.

A far “less sexy” but inevitable part of the job, and often the first step many organisations struggle with. It’s not unusual for up to 80% of the time and effort on a project to be spent first getting the data right. While this is a necessary first step to ensure the analytics and reporting you produce drive accurate and valuable insights for your business, unnecessary time is spent dragging the numbers around data handling software (DB, SAS, Python, R, Hadoop, etc.), manipulating them in Excel and presenting them in PowerPoint to bring the information to life.

So what’s the issue? 

At this point, we can already hear some of you saying, “What’s the issue? It’s what Data Scientists are paid to do, sometimes.” In fact, we see it all too often as a significant, and avoidable, problem. Data Scientists are getting paid to spend 50% of their time on a task which is often ineffective and unnecessarily repetitive.

We estimate that an average analytics team in a big company has around 50 employees, each working 20 days per month for 7.5 hours a day. This equates to 7,500 working hours per month available. So, if 50% is spent on inefficient reporting practices, that equals 3,750 working hours that could be utilised elsewhere… Every. Single. Month!

This inefficiency stems from the fact that the majority of organisations base their reporting on what we have dubbed the “Bermuda Triangle” model of (1) Data handling software – (2) Excel – (3) PowerPoint.
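The back-of-the-envelope arithmetic is simple enough to check for yourself (the figures below are the illustrative ones used in this post, not data from any client):

```python
# Illustrative team-capacity sums from the estimate above.
employees = 50
days_per_month = 20
hours_per_day = 7.5

total_hours = employees * days_per_month * hours_per_day  # hours available per month
lost_hours = total_hours * 0.5                            # half lost to reporting churn

print(f"{total_hours:.0f} hours available, {lost_hours:.0f} potentially lost per month")
```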

The process is simple, and worst of all, linear:

  1. The numbers, through ETL (Extract, Transform, Load), are created in the data handling software (DHS), then
  2. they are exported to Excel, where graphs are created, and
  3. finally the graphs and tables are used and commented upon in a PowerPoint presentation.

This works perfectly fine when there are no changes to the initial assumptions. In reality, there are always changes being made. Under this system, the slightest change will make the final visual incorrect, forcing analysts to go back through the triangle manually, again and again, to ensure accuracy. This is when we deem them to have been lost to the Bermuda Triangle.

[km-cta-block block-classes="has-dark-teal-background-colour has-white-colour" label="Contact us to discuss your requirements"]

Want to know more?

Our data experts would love to hear from you

[km_button link="https://www.dufrain.co.uk/contact/" classes="cta-2"]Contact us[/km_button] or [km_button link="tel:08001303656" classes="cta-2"]Call us on 0800 130 3656[/km_button][/km-cta-block]

One Dufrain Solution 

The team at Dufrain have made it our mission to claim back this 50% and spend it on actual, productive analysis work, eliminating, wherever possible, the need to move data around the “triangle”. This way we spend more time on the analytical elements of a project, produce better results and create a higher return on our clients’ investment.

How do we do that?

One solution is to utilise tools like Jupyter together with languages such as Python or R, which can handle all of these processes at once. In practice, this means that you load the data, undertake your analysis and produce your end tables and graphs within a single program. Since the code generates the output automatically, there is no need to shuttle the numbers between multiple tools just to render a chart. In addition, the live output allows you to easily identify and solve any issues with limited effort. So, when the time comes that the underlying data, the analysis or the chart needs to be tweaked, it does not present an administrative burden.
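As a minimal sketch of what this looks like in practice (the data, column names and text-based chart here are purely illustrative), a single Python script can cover all three corners of the triangle, so that a change to the input only ever requires a re-run:

```python
# One script replaces the DHS -> Excel -> PowerPoint round trip:
# extract, analyse and present in one place.
import csv
import io
import statistics

# Hypothetical sales extract - in practice this would come from a file
# or database connection rather than an inline string.
raw = io.StringIO(
    "region,revenue\n"
    "North,120\n"
    "North,80\n"
    "South,200\n"
    "South,150\n"
)

# 1. "Extract/Transform": parse the rows and group revenue by region.
totals = {}
for row in csv.DictReader(raw):
    totals.setdefault(row["region"], []).append(float(row["revenue"]))

# 2. "Analyse": average revenue per region.
summary = {region: statistics.mean(vals) for region, vals in totals.items()}

# 3. "Present": a quick text chart, regenerated automatically on every run.
for region, avg in summary.items():
    print(f"{region:<6} {'#' * int(avg // 10)} {avg:.0f}")
```

In a Jupyter notebook the same idea applies with richer output: the charting cell re-renders as soon as the data or the analysis cell changes, with no manual export step in between.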

An all-in-one approach is more efficient to use and can help data analytics teams produce the engaging visualisations they need, with a fraction of the effort spent wrangling data around the Triangle.

This is a common and growing challenge for organisations looking to improve their Data Science capability. In order to attract and retain talent, and to exploit the benefits of Analytics and Data Science, organisations must find a way to ensure that 50% of a Data Scientist’s time is not spent on ineffective and unnecessarily repetitive tasks. Not only is this a waste of time and budget, but it will also lead to high staff turnover as Data Scientists leave for roles where they can spend more of their time on high-value data analytics challenges.
With all this being said, the key question and exciting opportunity is: What could your analytics team achieve if they had 50% more time on their hands?

To find out more about this approach or other approaches we can take to help you with your data processing and visualisation, simply email us at enquiries@dufrain.co.uk to start the conversation.

[km-cta-block padding=20 block-classes="has-dark-teal-background-colour has-white-colour" label="Contact us to discuss your requirements"]

Find out which approach works best for your data

Simply email us to start the conversation

[km_button link="https://www.dufrain.co.uk/contact/" classes="cta-2"]Contact us[/km_button] or [km_button link="tel:08001303656" classes="cta-2"]Call us on 0800 130 3656[/km_button][/km-cta-block]