Saturday, June 23, 2007

Data Management Fundamentals

As said before, Data Management is about control: control of quality, control of the replication of information. Data Management is very much part of the plumbing of an Information Architecture, and therefore people tend to give it little attention. Remember - at the end of the day users want to use data in applications, but it is only the applications that they see. So the easier it is for them to get the right data, the better. They don't care how we get it to them, as long as it is easy to get and easy to understand. If it is good data then that's a bonus, but if bad data is easier to get than good data, then it is more likely that users will choose the former. If you use Google, would you browse to page 25 to get the right result out of your search? No.

So what does this tell us? What it tells me is that Data Managers have to provide a platform that is like using electricity: we flip a switch and we have light. Users should either get their data served to them automatically (i.e. the data is already replicated to the application and there is nothing to worry about), or they should be able to get it very easily: in one place, in the right format, and with the right quality - without ambiguity.

In many administrative environments this is already a reality - integrated back office systems have been rolled out over the last decade, and people cannot even remember that they once had to go through different systems to get something ordered and then an invoice paid. Note that some companies still struggle with their master data, but that is arguably more a flaw of SAP than a lack of integration and standardisation.

If we talk about the more complex industries like manufacturing, construction or oil & gas, then we see that we are clearly not there yet. And this is where Data Management really becomes interesting. Quite often data is held in many systems, and replication is ad hoc, manual, and clearly not perfect. This is where a lot of progress can be made by applying ten simple architectural measures:

1) Agree the scope: the master reference data, i.e. the data shared across systems

2) Agree who decides about the data in scope

3) Agree the standard for the master reference data

4) Agree the master source for this reference data

5) Agree basic quality rules

6) Agree how the information is replicated (format, frequency)

7) If more than one version of the same data should exist, agree how to deal with versioning

8) If users have to shop for the data themselves, then ensure its retrieval is in one place

9) Automate as much as possible, and keep it simple

10) Ensure there is a data helpdesk in place
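To make measures 3 to 6 concrete, here is a minimal sketch in Python of what they could look like in practice: one agreed master source, a few basic quality rules, and a replication step that only pushes clean records to a consuming system. All names here (the fields, the rules, the in-memory "systems") are hypothetical, purely for illustration - a real implementation would sit in your integration layer, not in a script.

```python
MASTER_SOURCE = "erp"  # measure 4: the agreed master for this reference data

# Measure 5: basic quality rules for a (hypothetical) material record.
# Measure 3 is reflected in the agreed standard for the 'unit' field.
QUALITY_RULES = {
    "id": lambda v: isinstance(v, str) and v.strip() != "",
    "description": lambda v: isinstance(v, str) and 0 < len(v) <= 40,
    "unit": lambda v: v in {"EA", "KG", "M"},
}

def validate(record):
    """Return the list of fields that violate a rule; empty means the record passes."""
    return [field for field, rule in QUALITY_RULES.items()
            if not rule(record.get(field))]

def replicate(master_records, target):
    """Measure 6: push only clean records, in the agreed format, to a target system."""
    for rec in master_records:
        errors = validate(rec)
        if errors:
            print(f"rejected {rec.get('id')!r}: failed {errors}")
            continue
        target[rec["id"]] = dict(rec)  # copy, so the target never mutates the master

master = [
    {"id": "MAT-001", "description": "Gate valve 2in", "unit": "EA"},
    {"id": "", "description": "Pipe, carbon steel", "unit": "M"},  # fails the id rule
]
maintenance_system = {}
replicate(master, maintenance_system)
print(sorted(maintenance_system))  # only the valid record arrives
```

The point of the sketch is the shape, not the code: the rules are agreed once and applied at the point of replication, so consuming systems never have to clean the data themselves.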


None of this is rocket science - but in most large organisations it is a struggle!


1 Comments:

At 1:28 AM, Blogger Unknown said...

To support continuous innovation, holistic design and networked collaboration, data management has to become part of knowledge management.
Knowledge management requires a holistic design approach, where role structures are designed in concert with products, processes and system services.
This is achieved by developing what we term Active Knowledge Architecture.

 
