Monday, March 5, 2012

IT Annual Reports

The following Blog post was originally published on the NTEN.org site.

March 5, 2012

One of our senior managers asked me soon after I started at IFRC, "Why is ISD the largest department with the largest budget in our division; what are you doing?"  This is not an unusual question; however, it was a wake-up call that the value IT was adding (and could be adding) to the organization was invisible.

I was first introduced to IT Annual Reports by the CIO Executive Council, where a number of members posted their reports as examples.  None of them grabbed me.  They were too technical, too tactical or too focused on internal customers.  Last year I discovered the annual Intel IT Performance Report.  It was clear, easy to read, and answered the fundamental questions: what value are we adding, what issues do we see for the organization, and where are we going? We set out with this as a model.

As we were setting strategic goals and objectives, we also asked ourselves: how do we know if the Information Services Department (ISD) team has moved the needle at the end of the year? What does success look like and who cares?  This is not a frivolous question.

Since the ISD team is about serving our customers (I take issue with those who say we don't[1]), we need to be clear about the audiences we serve.  We see five:
  1. Beneficiaries (everything we do must ultimately be measured by its impact on improving the lives of the most vulnerable)
  2. Field-workers in our National Societies who work directly with beneficiaries, delivering our programs
  3. Regional offices who work directly with local National Societies
  4. Headquarters staff who provide support for all of the above
  5. Our senior management team and governing boards who oversee our work
Your audiences may differ, but be clear about who benefits from your programs and who works closest with them.

When Tom Murphy, former Chairman and CEO of Capital Cities / ABC, Inc., was Chairman of Save the Children's Board, I asked him what was most important to him in taking the pulse of how an international child-focused organisation was doing.  He noted four questions, which became our (more positive) Murphy's Laws:
  1. Are we reaching more children?
  2. Are donations growing?
  3. Is the press good?
  4. Are employees happy?
These became the core questions that we reinterpreted for an IT organisation.  For our year-end scorecard, we posed ten questions from the standpoints of our audiences and our strategy as leading indicators of how we are doing. Here are the ten, with some sample things we measure for each:

1.  Are more beneficiaries being reached?
We track technology used by beneficiaries (the top of the pyramid[2]), our beneficiary budget spending versus lights-on spending, and the number of National Societies completing our Digital Divide capacity building program.

2.  Is our technology investment growing?
We benchmark IT spending against revenue and operating expense.[3]
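For readers who like to see the arithmetic, here is a minimal sketch of the two ratios behind this benchmark. The figures and the function name are invented for illustration; the real comparison is against the published survey data in footnote [3].

```python
# Illustrative only: the figures below are made up, and the actual benchmark
# comes from the Gartner and CIO4Good NGO surveys (see footnote [3]).
def it_spend_ratios(it_spend, revenue, operating_expense):
    """IT spend as a percentage of revenue and of operating expense."""
    return {
        "pct_of_revenue": 100.0 * it_spend / revenue,
        "pct_of_operating_expense": 100.0 * it_spend / operating_expense,
    }

print(it_spend_ratios(it_spend=3.2e6, revenue=210e6, operating_expense=180e6))
# {'pct_of_revenue': ~1.52, 'pct_of_operating_expense': ~1.78}
```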

3.  Are the customers happy?
We survey users annually on all aspects of our IT services and publish our customer satisfaction index for headquarters and the Field.[4]  We also track the number of thank-you notes we receive each month from our employees.
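As an aside on footnote [4]: that single "would you recommend us" question rolls up into the standard Net Promoter Score. A minimal sketch of the standard calculation (not our survey tooling) looks like this:

```python
def net_promoter_score(ratings):
    """Standard NPS: % promoters (9-10) minus % detractors (0-6), on a 0-10 scale."""
    promoters = sum(1 for r in ratings if r >= 9)
    detractors = sum(1 for r in ratings if r <= 6)
    return 100.0 * (promoters - detractors) / len(ratings)

print(net_promoter_score([10, 9, 8, 7, 6, 9, 10, 3]))  # 25.0
```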

4.  Are my problems getting solved?
We look at the percent of service calls solved on the first call and related data from our satisfaction survey.

5.  Are my projects getting done?
We track timeliness and cost of our larger "flagship" projects against original forecast.  We also track the project team's "confidence" index for completing projects on target.[5]
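Footnote [5] describes the confidence index in words; a minimal sketch of the same calculation, assuming nothing more than the simple average described there, would be:

```python
def project_confidence_index(ratings):
    """Average of each team member's 1-10 rating of how likely the project
    is to deliver on time, on budget and within scope (see footnote [5])."""
    if not all(1 <= r <= 10 for r in ratings):
        raise ValueError("ratings must be on a 1-10 scale")
    return sum(ratings) / len(ratings)

print(project_confidence_index([8, 7, 9, 6, 8]))  # 7.6
```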

6.  Are new technologies being delivered?
We monitor adoption rates of new products, such as mobile phones, including those selected by users themselves. We also report on new capabilities we deliver, like HD video conferencing, on-line meetings, larger email mailboxes and other “goodies”, and we survey users on their use of emerging technology tools and applications.

7.  Are National Societies getting stronger? (Is the Field getting stronger?)
Our Digital Divide program, and the number of countries assisted through it, is our key metric. We track MOUs signed, projects completed, and the ratio of budget spending for beneficiaries to IT costs.

8.  Is IT getting greener?
We count servers retired and on-line meetings held instead of travel for in-person meetings.  We also calculate savings in our carbon footprint.[6]

9.  Is IT using financial resources efficiently?
We report budget versus actual and a handful of ratios that we benchmark against other international organisations annually.

10.  Are our systems reliable?
We track "good days" and "bad days" and chart these monthly[7]


Answering these ten questions tells our audience how we are performing for them. If I had to pick two, growing our reach to those in need and keeping internal customers happy would be at the top of my list.  What value do we add to the organization?  As the title of our inaugural Annual IT Report states, we deliver mission relevant IT.  Ask our customers in the Field and HQ if we deliver on this promise.  We did.





[1] See the Wikipedia discussion and references on internal customers here: http://en.wikipedia.org/wiki/Customer ; for an interesting comparison, see http://ezinearticles.com/?Myth-of-An-Internal-Customer&id=2578986
[2] For a discussion of the IT Pyramid, see my Blog entry on “Six Views on Innovation”, section 4.
[3] We benchmark against the Gartner and the CIO4Good NGO surveys.
[4] A US colleague tracks one metric: "How likely is it that you would recommend our company to a friend or colleague?"  See the Net Promoter article in Wikipedia, here: http://en.wikipedia.org/wiki/Net_Promoter
[5] The Project Confidence Index is a periodic average of the front-line project team members’ individual, subjective assessments, on a 1-10 scale, of how likely the project is to deliver on its objectives (on time, on budget, and within scope).
[6] We use the carbon footprint calculator from Terrapass (www.terrapass.com). See their paper on Carbon Offsetting & Air Travel for a good review of the carbon savings data.   The Nature Conservancy also has a calculator worth comparing. 
[7] See the Hallmark case: James R. Johnson, “Magnifying the Problem”, CIO, November 15, 1992, pp. 34-38.



"The postings on this site are my own and don't necessarily represent positions, strategies or opinions of any of the organizations with which I am associated."