The good & bad of BI maturity models

I wanted to discuss two things that get used a lot in Data & Analytics, although I’m sure they exist in many other areas.

  • Maturity models – aka capability models
  • Maturity assessments – aka capability assessments

Maturity models are just a static image: a diagram explaining the spectrum of what can be done within a particular discipline.

Maturity assessments are surveys that require you to answer a range of questions about your organisation; they then generate a document/report that tells you your areas of strength and weakness, with some suggestions on which areas you should focus on first.

Some examples

Above, an example Maturity Model.

With this particular example, my opinion is that the exponential curve is inverted. For most organisations, moving from “1.0 Frustrated” to “2.0 In Control” is where you will see the biggest jump in “competitive advantage”.

Above, a screenshot of an example DWH Maturity Assessment report from Google.

The good and bad…

First of all, I’ll start with my conclusion: Capability Models & Assessments can be really useful tools BUT they can also do more harm than good. Here is why I think that…

The good

Maturity models can really concisely and effectively communicate common patterns and problems.  A lot of time and effort has clearly gone into making them, and they often show a good understanding of the industry.

They provide a great starting point for teams wanting to discuss strategy.

Capability Assessments can pose questions that make you reflect and review what you currently have.  If you take your time, you can reason through the output and cherry-pick some possible next steps.

Below is an example of one of the better models:

The bad

Maturity models imply that “left is bad, right is good” – the question of quality never comes into it, nor does the philosophy of continuous improvement.

Take a situation where you have a Data Warehouse that takes 23 hours to load, the data is wrong and it costs a bomb to operate… according to these techniques, none of that matters because that step is done and so it’s ‘on to the next one’.

For example, compare the following two teams:

Team A - A large, expensive team producing ML models with data sitting in a data-mesh architecture. The output is either not used or doesn't actually create better outcomes for the company.
Team B - A small team of data developers producing simple but high-quality reporting and automating data processing tasks using a relational database. 

If you don’t have a really strong grasp of, and experience in, the data industry, then using these maturity models/assessments will lead you to the conclusion that Team A are much “more advanced” and “performing better” than Team B.

Some leaders may not take their organisations’ individual circumstances into account and look to emerging technologies as silver bullets.  Scenarios arise where great data teams who are a key part of the business can all of a sudden be perceived to have not moved with the times and to have let the competition gain an advantage.

The assessments can also result in FOMO (fear of missing out) and knee-jerk reactions. For example, hasty and speculative new projects may get spawned with little or no ROI; anecdotally I have heard this about some Data Lake projects.

How they can be misused

A feature of BI maturity models is that they oversimplify – to fit on one page, they have to.

This is not a problem if the people using them understand the domain really well, as the model can then serve as more of a discussion prompt.  But people who only have a very high-level understanding can suddenly believe that the road-map to great data is simple… move from left to right and tick everything off the list.

Maturity assessments/models can also be used by vendors to ‘neg’ the organisation’s in-house team.  ‘Negging’ is a technique where criticism is used to erode someone’s confidence, making them more open to suggestion.  The criticism being cast doesn’t have to be valid or relevant; it is just used to persuade others that there is a problem where one didn’t exist before… and that the consultants have the answer.

Capability Assessments in particular are used by consultancies. When first engaging with them, you are often asked to complete their own bespoke surveys. If you answer honestly, the results may be used to try and knock business leaders’ faith in their existing teams and to claim that the consultancy has much greater capabilities.

Summary

To recap my thoughts in 3 points:

  1. BI Maturity Models/ Assessments can be useful tools
  2. Some are good, some are pretty useless
  3. Understanding how they can be misused will help prevent it happening to you and your team

ITIL : A view from the Trenches

A while back I stumbled across a great blog post from a guy called Greg Ferro with his musings on ITIL. It’s a very good read in its own right and can be found here.

Unusually however, the pages of discussion and comments underneath the article, arguing both for and against ITIL, contained just as much insight as the main article itself. A lot of people felt so passionate about the subject that they took the time to relay their own personal experiences in a really considered and articulate way.

The slideshow below is just a collection of some of the best comments against ITIL – I’m not trying to be balanced here. They are largely posted ‘as is’, but some comments have been combined on the same slide just to use the space wisely.

Btw, I’m not personally commenting on ITIL’s merits – but as ITIL is seemingly ubiquitous in organisations everywhere, it is rarely challenged and doesn’t seem to have a feedback loop – which is one of the reasons I found the post and comments so interesting.

So here you go…

rightly or wrongly, a lot of people feel this way about ITIL

Assess your dev teams with the Ben Test

Introduction

The original Joel Test is a work of genius – it allowed you to assess the quality of your development teams in under 5 minutes.  By answering the yes/no questions, it steered you into thinking through each of the points and understanding the weak points in your process.

It was written in August 2000 (that’s before Windows XP was even released) and is still very relevant today, which suggests that a lot of IT teams are still making the same mistakes they were making 18 years ago.

I’ve heard one developer sum up the Joel Test perfectly when he said, “the beauty of the Joel Test is its simplicity vs its effectiveness”.

Just as the original did, these tests deliberately cover the basics. I have intentionally not included tests such as “are you agile?” or “do you do DevOps?” as a lot of people misunderstand these concepts and unfortunately they sometimes get thrown around as management buzz-words.

Just for the record, I have worked mainly as a data warehouse & BI developer for financial services firms in London, within IT teams ranging from 2 people to 100s; if I had spent my career working as a web developer for Facebook or Amazon then I’m sure my tests would be quite different.

[Image: The Ben Test – 12 yes/no questions]

Remember, Yes = 1 point.  No = 0 points.

  • 11 or 12 points – Keep doing what you’re doing
  • 8, 9 or 10 points – Keep going but analyse where you can gain efficiencies
  • 7 or under – Stop what you’re doing, call a team meeting and make a plan

 

01 – Does your team have a goal… and do all your team members know what it is?

Do your team members also know what their individual goals are and those of the company?

This is universal and applies to any team or individual. It’s not really about having goals – which all companies will have – it’s about whether or not they are being communicated down the layers and whether everyone is striving for the same success.

Benefits of having clearly understood goals

  • Much more cohesion between teams when people pull in the same direction.
  • Goal-relevant activities take up a lot more of the overall effort than goal-irrelevant activities.
  • Managers cannot motivate teams by handing out a series of seemingly unconnected tasks.

Continue reading

An Example Tableau Security Model

My experience navigating Tableau security as a novice…

I recently upgraded a Tableau 10.1 estate to Tableau 2018.1.  I used the opportunity to completely rework the security from the ground up.

When starting out, much of the guidance I found on the net was focused on the many individual components that make up a Tableau estate.

While I’m certainly not claiming what I have done is best practice, I hope it will trigger some ideas and serve as a starting point for your own implementations.

This guide doesn’t cover licensing although that is something which is definitely worth understanding if you are implementing a Tableau security model.  If you need to learn about Tableau licensing then this article does a great job of explaining both the old and new models.

Continue reading

Top 10 concepts from Netflix’s culture of ‘Freedom and Responsibility’

Back in 2009 Netflix released a slide deck called ‘Freedom & Responsibility’ that explained some of their strategy and culture.

Facebook COO Sheryl Sandberg said that it “may well be the most important document ever to come out of the Valley”.

I first heard about it a few years back when I read Dave Coplin’s brilliant (and succinct) book Business Reimagined, which you can download here for free.

‘Freedom & Responsibility’ was something that evolved at Netflix over many years.  Here are what I consider to be the 10 most notable ideas from the document.

Continue reading

Software Development & Broken Window Theory


Broken Window theory goes something like this:

  1.   Some broken windows are left unrepaired in a neighbourhood…
  2.   People see this state of disrepair and feel like no one cares about their surroundings…
  3.   Because nobody cares, people feel like they can cause further damage without repercussion…
  4.   Further damage is done, perpetuating the cycle.

We see this exact same pattern in all areas of life including software development.

Continue reading

SSRS Standards

When I joined my current employer, they had just started an informal project with 6 BI developers to deliver a host of new reports against their new data warehouse using SSRS 2016 Standard.

These are the standards I put together to try and bring some consistency to both the reports and the environments.

As with any standards, my goal was to address the main areas whilst not being too onerous or concerning ourselves with minute details.  It was also to provide a steer on items which end up being arbitrary decisions, such as the names of data sources.

Hopefully it can act as a starting point for your own organisations.  I’ll look to cover the rationale behind the access model in another post.

The document, which also contains details on how to set up the project configurations, can be found here.

SSRS Standards

Related Links:

How to enable Remote Errors in SSRS

How to set retention period of the execution log

Data Dictionary

Data Dictionary w/ Search Functionality (2016)

At CNA-Hardy, I put together a data dictionary for an Underwriting/Actuarial MI system I looked after.

I created the data dictionary in Excel with a search facility built in.  With around 250 calculations and attributes, it made understanding and troubleshooting a lot easier.


The .xlsx version of the tool can be seen here.

The .xlsm version which also filters the rows can be found here.
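
If you wanted to seed a similar dictionary straight from SQL Server metadata rather than maintaining it entirely by hand, something like the query below is a reasonable starting point. To be clear, this is my own rough sketch rather than how the workbook above was built, and it assumes you have (or will) populate the MS_Description extended properties.

-- Pull column metadata plus any MS_Description extended properties
-- as the raw material for a data dictionary export to Excel.
SELECT
    SCHEMA_NAME(t.schema_id)         AS [Schema],
    t.name                           AS [Table],
    c.name                           AS [Column],
    ty.name                          AS [Data Type],
    c.max_length                     AS [Max Length (bytes)],
    c.is_nullable                    AS [Nullable],
    CAST(ep.value AS NVARCHAR(4000)) AS [Description]   -- blank unless MS_Description has been set
FROM sys.tables t
JOIN sys.columns c  ON c.object_id = t.object_id
JOIN sys.types  ty  ON ty.user_type_id = c.user_type_id
LEFT JOIN sys.extended_properties ep
       ON ep.class = 1                                   -- object/column properties
      AND ep.major_id = c.object_id
      AND ep.minor_id = c.column_id
      AND ep.name = 'MS_Description'
ORDER BY [Schema], [Table], c.column_id;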

 

Technical Debt

Technical debt is a metaphor that equates software development to monetary debt.  In my opinion it is one of the most crucial concepts to be aware of when planning projects or road-maps.

Imagine a project with two potential options: one is quick and easy but will require modification in the future; the other has a better design but will take more time to implement.

In development, releasing code with a ‘quick-and-dirty’ approach is like incurring debt – it comes with the obligation of interest, which, for technical debt, comes in the form of extra work in the future.

Just like monetary debt, technical debt is interest-bearing and compounds.  You always have the option to pay down the debt (long-term thinking) or to take out additional credit (short-term), but your project can become insolvent, where the only option is to write off the debt (rewrite from scratch).

To summarise, it is a debt that you incur every time you avoid doing the right thing, like removing duplication or redundancy.  It will add an overhead to everything you do from there on in, whether that is troubleshooting, maintenance, upgrading or making changes.
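
As a small, purely illustrative SQL example of the duplication point (the table, view and column names are made up): the quick option is to copy-paste a business rule into every report query; paying down the debt means defining it once and reusing it.

-- Quick-and-dirty: the same rule copy-pasted into several report queries,
-- so every change has to be made (and tested) in multiple places.
--   CASE WHEN p.Status IN ('LAPSED', 'CANCELLED') THEN 0 ELSE 1 END AS IsActive

-- Paying down the debt: define the rule once and reuse it everywhere.
CREATE VIEW dbo.vPolicyActive
AS
SELECT p.PolicyId,
       CASE WHEN p.Status IN ('LAPSED', 'CANCELLED') THEN 0 ELSE 1 END AS IsActive
FROM dbo.Policy AS p;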

[Some parts taken from MartinFowler.com and Techopedia]

Default SQL Server Settings

Introduction

You can change server-wide default settings by using facets and properties at the server level, or by modifying the model system database (which acts as a template for newly created databases).
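
For example, if you did want to change defaults via the model database, it would look something like the below. The values are purely illustrative rather than recommendations, and new databases only pick them up at creation time.

-- New databases inherit their recovery model and file settings from model.
ALTER DATABASE model SET RECOVERY SIMPLE;

-- Sensible starting sizes/growth increments instead of the tiny defaults.
ALTER DATABASE model MODIFY FILE (NAME = modeldev, SIZE = 256MB, FILEGROWTH = 128MB);
ALTER DATABASE model MODIFY FILE (NAME = modellog, SIZE = 128MB, FILEGROWTH = 64MB);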

I don’t change much but here are the settings I do change to make life easier for myself.

Note: I fully understand that everything needs to be considered on a case by case basis – I’m just presenting these as possible ideas.


#1 – Backup locations – (In SSMS object explorer, right-click the server >> Facets >> Backup Directory)

When configuring my backup procedures, I like to set up two folders on each database server, such as the below:

E:\SQL_Backups_Nightly\ | Used with nightly backup maintenance job.
E:\SQL_Backups_Adhoc\   | Used for manual backups/restores.

I then set the latter as my default backup directory in the server facets pane.

This means that when I do manual backups/restores, the dialog box will take me straight to this folder.  I find that when restoring a database under pressure, trying to navigate to the default paths (which usually look like the one below) just adds unnecessary confusion.

\\Servername\e$\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\Backup
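
If you would rather script the change than click through the facets pane, the default backup directory is just a registry value, and the (undocumented but widely used) xp_instance_regwrite procedure will set it for the current instance. The folder is my example path from above.

-- Point the instance's default backup directory at the ad-hoc folder.
EXEC master.dbo.xp_instance_regwrite
     N'HKEY_LOCAL_MACHINE',
     N'Software\Microsoft\MSSQLServer\MSSQLServer',
     N'BackupDirectory',
     REG_SZ,
     N'E:\SQL_Backups_Adhoc';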

#2 – File Locations – (In SSMS object explorer, right-click the server >> Properties)

This is similar to the backup locations above.  In Database Settings under Database Default Location I use the following values.

E:\SQL_Server_Data\
F:\SQL_Server_Logs\

Again, they are just easier to find, and if you ever have to write MOVE statements and suchlike, this will make it a lot easier.
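
For example, a restore with MOVE becomes much easier to type when the target paths are short and predictable (the database, backup file and logical file names below are hypothetical):

-- Restore a copy of a database, relocating its files to the simpler paths.
RESTORE DATABASE SalesDW_Copy
FROM DISK = N'E:\SQL_Backups_Adhoc\SalesDW.bak'
WITH MOVE N'SalesDW_Data' TO N'E:\SQL_Server_Data\SalesDW_Copy.mdf',
     MOVE N'SalesDW_Log'  TO N'F:\SQL_Server_Logs\SalesDW_Copy.ldf',
     STATS = 10;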

For the record, it’s a good idea to have your log files on a separate physical disk to your data files; please see this article for a full explanation.

Continue reading