Posts Tagged ‘architecture’

Dialogue over Diagrams

April 20, 2010

In considering how Agile changes the role of the Architect, I keep coming back to the Agile Manifesto's axioms, or preferences.  For the Agile Architect, I think we need to add a new one:

Dialogue over Diagrams

As with the manifesto, we're not saying that diagrams are no longer required.  A diagram is still worth a thousand words.  Besides, what would we do with our Visio and UML tools?

The point of Dialogue over Diagrams is not to say we don't need diagrams.  The point is that, as Architects, we need to remember that diagrams are not the final objective.  Working software [products] is the final objective.  Working software comes when there is a common understanding between owners and builders of what needs to be built and how it needs to work in order to satisfy the owner's vision.

I believe the best and fastest way to achieve common understanding is through dialogue.  The diagrams are still necessary, but they are no longer your primary focus.  They are enablers.


Starting The Agile Architect

April 16, 2010

What’s an architect to do when the team goes Agile?

The Agile Architect is a project I've started with Peter Wennerstein of Icon Innovations.   The inspiration came when the leaders on the project I'm working on decided we were going Agile and sent everyone to Scrum training.   The vision is to give guidance and encouragement to all the architects out there who want to be a valuable part of Agile projects.  I'm bringing my experience as a leader in software development and long-time Solutions and Enterprise Architect.  I'm teaming up with Pete because of his passion and talent for communicating with images, icons and multimedia.  I think we're going to have a lot of fun with this and hope it really helps you on your journey down the path of becoming an Agile Architect.

If you are interested in this topic, please leave a comment to encourage me in this project.



Granting a Monopoly to your IT Suppliers

August 28, 2009

In Enterprise Architecture we are often fixated on standardizing technologies across the enterprise.  We publish standard technologies for services such as DBMS, storage and network, as well as application services like workflow, reporting and reconciliation.

The rationale for standardization is pretty straightforward.  Diversity is costly.  Each technology product carries with it operational risks that must be managed, legal and commercial relationships to maintain, and operational and support costs for the platform.

The down side of standardization

In our zeal for standardization, though, we may be blind to a few of its downsides.  Here are a couple:

  • Granting a Monopoly – standardizing on a single vendor removes the competitive pressure on that vendor.
  • Concentration Risk – a term from financial markets meaning too much of your risk exposure is concentrated in a single security or customer.  What happens when your standard supplier goes bust?  That seems to be happening more frequently these days.

Bad things happen when you grant a monopoly

Your company is no different from the broader economy: when you grant a monopoly in a given sector, you eliminate the competitive pressure for a supplier to deliver quality and innovation.  You also put all your eggs in one basket.

What’s your view?

Is this just too hypothetical? I’d like to hear others’ views on the topic, so please comment or direct me to your blog.

Target State Architecture without a Project

July 28, 2009

Today I found myself explaining why the Department, or Domain, I have been responsible for for nearly six months has no Target State Architecture.  Hmm, you say.

Seriously – as an IT Architect responsible for a particular business domain in my company, one of the things I should create and maintain is the Target State Architecture.  If you’re an enterprise architect, this is quite obvious.  The problem I faced in developing the target state, though, was that in a Global Financial Crisis, when a department head has decided they are not going to spend any money for the foreseeable future, the target state looks pretty much like the current state.  It’s not the Dream State we’re talking about.  So getting motivated to draw a diagram or write a description of something that isn’t likely to happen is not fun.

Data Migration

June 16, 2009

Is it just me, or is Data Migration suddenly interesting?

I recently joined a big Australian bank and noticed there are many data migration exercises to come as a result of a recent merger and major system replacement programs in the works.  With so much data migration work in the pipeline, I thought it would make sense to standardize the way data migrations are performed, and perhaps even set up a competency centre or the like.

Some Research

I know a small consulting firm in Sydney named Lucsan Capital that specializes in helping financial markets firms with implementations of systems such as Murex and Calypso.  They have been engaged in many system implementation projects and found a need for a tool that combines Data Migration, Reconciliation and Process Management into a single, easy-to-use platform.  They’ve built this tool, called LMIG, and use it as a practice aid on their implementation projects.
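To illustrate the kind of reconciliation step a tool like this automates (this is just my own minimal sketch, not LMIG itself), here is a key-based comparison of source and target extracts, with the key field name chosen for illustration:

```python
def reconcile(source_rows, target_rows, key="id"):
    """Compare source and target extracts after a migration run.

    Returns three sorted lists of keys:
      missing    - rows in the source that never landed in the target
      unexpected - rows in the target with no source counterpart
      mismatched - rows present in both but with differing values
    """
    src = {row[key]: row for row in source_rows}
    tgt = {row[key]: row for row in target_rows}
    missing = sorted(set(src) - set(tgt))
    unexpected = sorted(set(tgt) - set(src))
    mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
    return missing, unexpected, mismatched
```

A real migration would add counts, tolerances for numeric fields, and sign-off reporting on top, but the core check is this kind of keyed diff.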

What’s the competition doing?

Though I have been pretty impressed with what I’ve seen of LMIG and the Lucsan people, before going too far I thought I had better do a bit of research to see what the competition has to offer.  I looked at Informatica and IBM InfoSphere.  Both are leading ETL products and obvious candidates.  But there is definitely a distinction between the requirements of large, mission-critical ETL platforms – things that populate your data warehouse or act as information gateways – and the needs of a project team working to quickly and safely migrate data from one or more legacy systems to the target environment.

Informatica’s Data Migration Solution contains information on the work Informatica has been doing on both the tool and their Velocity Methodology to adapt to the requirements of Data Migration.  There is also a fair bit of research by Bloor in this space, which looks at the market opportunity and competing products.

Both these products appear to be world class.  Where they seem to fall down is in being almost too good.  They both have many modules, options and moving parts in their full deployments.  For example, there are developer studios, process servers, schedulers, etc. – all of these have to be identified and costed in your final solution.  That may be suitable when building a stable ETL environment, but in Data Migration you need a bit more agility and simplicity – something that gets the job done without becoming the focus of your entire project.  Keep in mind, Data Migration is really just a necessary evil on the way to a strategic objective such as system consolidation or an upgrade from a legacy system to a shiny new one 🙂  You need to be sure your DM solution doesn’t divert your attention from the real objective.

How about OpenSource?

I generally love open source and Java for everything.   A few of my old colleagues at Macquarie Bank turned me on to Talend, an open-source data integration platform.  It has a data-profiling engine, which sounds very interesting.  If I had a development team working for me, I’d probably be keen to go open source.  But at the moment I’m looking for something that works out of the box and is easy for Business Analysts to use.  I’ll look at Talend a little later.


I’d love to tell you the conclusion, but I’m afraid the jury is still out.  I’m really looking forward to getting past the analysis stage and delivering some benefits to the business in the form of greatly reduced lead times for data analysis and data migration.  Let me know if you have any views on this topic.


Performance Monitoring and Capacity Planning

December 5, 2008
At work we’ve just finished our first month of live operation for a large new system.  Now that the teething problems are past, we need to be sure we understand how the system will perform as it grows.

I was hunting around for tools like Orca or Cacti which provide graphical presentation of Load vs Capacity trending over time.

This is interesting information, but it is only part of the question. While hunting, I came across Adrian Cockcroft’s Capacity Planning blog. He shared an interesting concept he picked up at eBay called time-to-live, which is a function of headroom and growth.
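As I understand the idea, headroom is the capacity you have left, and time-to-live is how long until growth consumes it. A minimal sketch, assuming linear growth (the function and units here are my own illustration, not from the blog):

```python
def time_to_live(capacity, current_load, monthly_growth):
    """Months until the system exhausts its headroom, assuming linear growth.

    capacity and current_load are in the same load unit (e.g. requests/sec);
    monthly_growth is load added per month in that unit.
    """
    headroom = capacity - current_load
    if monthly_growth <= 0:
        return float("inf")  # flat or shrinking load: no exhaustion date
    return headroom / monthly_growth

# e.g. capacity 1000 req/s, running at 600 req/s, growing 50 req/s per month:
# time_to_live(1000, 600, 50) -> 8.0 months
```

The value of framing it this way is that a single number per system ("eight months of life left") is far easier to act on than a trend chart.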