Monday, October 22, 2007

Architecture reviews

If your company has a mature project management process, then this process includes regular reviews (decision gates). It is very useful to ride on the back of this process when establishing architecture. If an architecture review, based on clear criteria, is part of these decision points, then architecture is likely to get more attention.

The other upside is that architects will have a higher chance of getting involved in the projects: knowing what is going on, knowing what works and what is required, and doing some marketing for architecture-related ideas.


Sunday, October 21, 2007

Gap analysis

Data management can be a bit daunting sometimes (there is so much to do, so little time, and nobody really supports it). So an architectural approach can be counterproductive (too many issues, too many fluffy slides) and turn people off in their drive for improvement.

In this case it can be more useful to take an approach that focuses on addressing the major gaps, with clear deliverables. This approach can make use of the architecture patterns I posted earlier. The patterns can very quickly identify components that are missing. Say you have a transaction environment and no business intelligence solution: then this is quite likely a gap, and probably a pain point that needs addressing.

If this is still too complex (the patterns can be hard to understand, because people mix up physical data stores with 'roles'), then the last resort is a simple set of interviews to create a heat map of where people experience pain with their IT systems. This pain is usually caused by the usual data management problems (no ownership, no clear rules & enforcement for quality, spaghetti integration, no standards, etc.). So the trick is to link the pain from the interviews to some root causes and then create a simple staircase diagram that addresses the key gaps in a logical sequence. If you cannot define clear deliverables that link to existing projects or programs, then it is very likely any architectural attempt will fail.
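The interview-to-heat-map step can be sketched in a few lines of code. This is purely an illustrative sketch: the systems, pain areas and severity scores below are hypothetical examples, not real interview data.

```python
from collections import defaultdict

# Hypothetical interview answers: (system, pain area, severity 1-5).
interviews = [
    ("CRM", "data quality", 4),
    ("CRM", "ownership", 5),
    ("ERP", "integration", 3),
    ("ERP", "data quality", 2),
    ("ERP", "ownership", 5),
]

def heat_map(answers):
    """Aggregate severity scores per (system, pain area) cell."""
    cells = defaultdict(int)
    for system, area, severity in answers:
        cells[(system, area)] += severity
    return dict(cells)

hm = heat_map(interviews)

# The highest-scoring cells point to the gaps to address first,
# i.e. the first steps of the staircase diagram.
hotspots = sorted(hm.items(), key=lambda kv: kv[1], reverse=True)
```

The point is not the code but the ordering it produces: the hotspots list gives a defensible, data-driven sequence for the staircase diagram instead of an architect's gut feel.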


Thursday, October 11, 2007

Data Management in the Project Lifecycle

Project managers surprise me on a regular basis. I know that the main aim of their work is to deliver what was agreed, but that does not dismiss them from responsibility for the quality of the deliverable.
It may be a bit of a rough remark, but most project managers have 'tunnel vision': the project needs to be delivered, on time, on budget. But this can be bad for data management.


So what do I recommend to project managers before they begin?
  • Do a lot of 'front-loading' - i.e. before the project begins, it is important that the DM deliverables are defined and that the planning takes care of delivering them during the solution architecture phase (the high-level design) and beyond. If you miss them at an early stage, it is hard to recover ...
  • Ensure that you as a project manager also understand the project in its 'data context'. This includes understanding the other related systems, but also responsibilities like data ownership
  • Agree on resources for taking care of the data management element. Data architects or data analysts are useful resources
And then during the project it is a matter of ensuring that the scope - agreed during the front-loading - is delivered!


Monday, October 08, 2007

Being everything to everybody

One of the regular mistakes I see on an almost day-to-day basis is systems aiming to be everything to everybody. A user always wants more, and that is a nice trigger for vendors to let their systems grow.

Usually packages originate from a single functional requirement, and over time more and more gets added to the scope. In the background there is also the drive of 'if you have a hammer, then everything looks like a nail' (that is also why SAP systems grow beyond what they were intended for ...).

One of the key roles of an architect is to keep the scope of systems close to what they were intended for and to avoid the scope creep that causes overlapping functionality. It is not always a popular message, but it avoids a lot of spaghetti and uncontrolled growth. At the same time it is important not to be too harsh: sometimes new paradigms can flower under the wings of something else and suddenly create a lot of value. Just like the Internet was never designed for what it is used for today ...


So how to keep this balance? I guess at the end of the day it is all a matter of judgement: some flexibility on one side (let the flowers bloom) and some rules on the other (start pruning when needed). There are no simple answers in architecture!


Wednesday, October 03, 2007

Data management and Web2.0

After a few months of data architecture I notice that my original thoughts on using the paradigms of Web 2.0 have slowly moved a bit more to the background. Still, I have to keep reminding myself how we can move away from 'old' centralistic thinking towards enabling the 'new' collaborative content creation model.

I think there are several roles for data management in enabling this transition, but the more I think about it, the more I realise that this is not a simple journey.

First of all, I think data management can play a major role in establishing a service-oriented architecture that moves the complexity of data storage and data integration to the background. If the provision of quality data becomes like infrastructure, then its usage will become more natural and will enable easier collaboration. But this requires a lot of work: moving the complexity to the background and establishing good old data management processes for creating proper quality data in the right master source (so not really 'webby' 2.0).

Second, I need to fight for opening up the management of metadata to the 'masses'. Why not publish it as a wiki? I have already started piloting this with our main data management principles (the architecture principles should follow soon), and for me it already starts to work. A wiki is relatively easy to maintain, it can link to anything, and it is very accessible. On the first point (maintenance) I see one downside: it is not easy to automate. Some structured content is just easier to maintain in a database. Maybe this is an idea for a further development of the wiki (a 'wikidb' based on MySQL, where structured content like standard picklists can be managed). On the second point (linking) I only see upsides: a lot of content I do not have to invent myself, because it is already there! This also helps accessibility. Many anti-wiki people fear the lack of 'control' of a wiki, but I find that the audit trail feature and the 'official' nature of the publishing mechanism help to keep the content professional.
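The 'wikidb' idea for structured content could be sketched as follows. This is a hypothetical minimal sketch, not an actual implementation: it uses SQLite (in memory) purely for illustration, where the post suggests MySQL, and the table and list names are made up.

```python
import sqlite3

# Hypothetical sketch: structured content (a standard picklist) kept in a
# database so it can be maintained and queried programmatically, instead of
# being hand-edited on a wiki page. SQLite stands in for MySQL here.
conn = sqlite3.connect(":memory:")
conn.execute("""
    CREATE TABLE picklist (
        list_name TEXT NOT NULL,
        value     TEXT NOT NULL,
        UNIQUE (list_name, value)
    )
""")
conn.executemany(
    "INSERT INTO picklist (list_name, value) VALUES (?, ?)",
    [("country", "NL"), ("country", "UK"), ("unit", "bbl"), ("unit", "m3")],
)

# A wiki page (or any other consumer) could then render the current list:
countries = [
    row[0]
    for row in conn.execute(
        "SELECT value FROM picklist WHERE list_name = ? ORDER BY value",
        ("country",),
    )
]
```

The design point is the split: free-form prose stays on the wiki where anyone can edit it, while controlled lists live in a small database and are merely rendered onto the wiki, so they stay consistent everywhere they are used.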

Another thing worth pushing is the use of blogging and indexing in environments where data is interpreted. Should we really keep an audit trail of everything if users can keep a blog of what they have done, and indexing provides the basis of search and ranking mechanisms towards the right content? Should we introduce search on top of structured data?

Finally, I think that the idea of collaboration environments is a good principle for many data management paradigms. Maybe not for transaction processing (that is all about control), but for interpretation environments, research & development contexts and engineering it makes sense to organise information more around workgroups and less around 'corporate control'.
