Improving IT Maturity in Higher Education
Sponsored by TeamDynamix
About This Toolkit
This HDI toolkit is a series of practical, instructive job aids designed with the IT service management
practitioner in mind. Each area of focus can be studied and used by itself, or as part of the whole. This
toolkit will help clarify IT maturity, as well as the steps necessary to achieve it in your organization.
Maturity, as it pertains to IT and IT service management, means the state of people, processes, and
technology in the organization, and the achievement of improvement goals based on accepted
measurements. Generally, a model is chosen (two are discussed below) and assessments are
performed, either by the organization itself (self-assessment) or by an auditor from an external
organization.
Maturity matters because it shows the capability of the IT and IT service management organization to
serve as a ready, willing, and able partner to achieve business or organizational goals in a stable and
predictable way.
In general, states of maturity move from the chaotic or ad hoc to the optimized in defined steps
according to the model. According to CMMI, “each maturity level provides a layer in the foundation for
continuous process improvement.”
Generally, stages of maturity move from the unorganized and reactive to the organized, managed,
measured, and proactive. The similarities between these two maturity models should be evident:
the terms initial, defined, and managed appear in each. The most mature state in each is about
optimization.
Increasing maturity:
• ITIL maturity model: Initial (handled on a case-by-case basis, ad hoc); Repeatable (procedures
are followed, but individual knowledge is depended upon); Defined (procedures are documented
and standard; becoming more proactive).
• CMMI: Initial (unpredictable and poorly controlled); Managed (process characterized for
projects, and reactive); Defined (process characterized for the organization, and proactive).
HDI research tells us that about 6% of support centers use CMMI. While ITIL is the framework of
choice in at least 50% of organizations, how many of those organizations assess their maturity using
the ITIL model is unknown.
If you choose to use a maturity model (and these are only two of many), you will need to consider
which one is right for your organization. One way to assess that is to get to know organizations that
are similar to yours and ask them which maturity model they use, and why. Great ways to find
answers to these questions include asking in LinkedIn groups, in person at local or national industry
events, or in dedicated community groups such as HDIConnect.
Self-assessments are most useful for organizations that are seeking to undertake improvement
programs, but do not have a clear idea of their current state. Many self-assessments are available
for low or no cost, and can help establish the needs of the organization for improvement, including
specific deficiencies and strengths.
Organizations tend to either overestimate or underestimate their own maturity level, so
self-assessment results should be treated with some caution. If the assessment is well designed, its
questions will produce a far more accurate view of the organization than an unsupported internal
estimate (a guess) would. Be honest in your evaluation of where you are.
Self-assessments provide a questionnaire of some length, asking about very specific aspects of the
IT organization, along with at least some questions about the organization as a whole. The responses are scored
in some fashion, and a report is delivered back to the assessment-taker with the results. Self-
assessments should not be considered definitive, but they do give the organization a snapshot of
the current state of maturity, and can be used in building a business case for having a professional
assessment done.
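To make the mechanics concrete, the sketch below shows one way questionnaire responses might be
scored and mapped to an indicative maturity level. The questions, weights, and level thresholds are
invented for illustration; they are not drawn from CMMI, ITIL, or any particular assessment product.
Example (Python):

# Illustrative sketch only: the questions, weights, and level thresholds
# below are invented for demonstration and do not come from CMMI, ITIL,
# or any vendor's assessment.

QUESTIONS = {
    "Incidents are logged in a single system of record": 1.0,
    "Standard procedures are documented and followed": 1.5,
    "Metrics are reviewed with stakeholders outside IT": 2.0,
}

LEVELS = [  # (minimum share of the possible score, level name)
    (0.0, "Initial"),
    (0.4, "Repeatable"),
    (0.6, "Defined"),
    (0.8, "Managed"),
    (0.95, "Optimizing"),
]

def score_assessment(responses):
    """responses maps each question to an agreement rating from 0 to 5."""
    possible = sum(5 * weight for weight in QUESTIONS.values())
    earned = sum(responses.get(q, 0) * w for q, w in QUESTIONS.items())
    share = earned / possible
    level = "Initial"
    for threshold, name in LEVELS:
        if share >= threshold:
            level = name
    return share, level

share, level = score_assessment({
    "Incidents are logged in a single system of record": 5,
    "Standard procedures are documented and followed": 3,
    "Metrics are reviewed with stakeholders outside IT": 1,
})
print(f"Score: {share:.0%} -> indicative level: {level}")  # Score: 51% -> Repeatable

A real assessment will weigh many more questions and areas, but the principle is the same: answers
are scored, rolled up, and mapped against the levels of the chosen model.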
Professional assessments are done by certified third parties who have no interest in making
your organization look better than it is. Self-assessments are not comparable to full, professional
assessments.
It is best to make a decision about which maturity model is best for your organization before taking
any self-assessments, since each assessment is based on a specific model. Your organization should
not assume that if a self-assessment under the CMMI model says you are at Level 4, you are at Level 4
in all maturity models. Specifics vary, as will the results of your self-assessment.
Because they are comparatively low in cost, self-assessments can be taken often, and may serve as
mileposts in between professional assessments.
In summary:
• Choose a self-assessment that suits your organization
• Be honest
• Don’t confuse self-assessment results with a professional assessment
Collecting metrics and measurements is easy; deciding which ones are important and useful is more
difficult. This in itself is part of maturity: understanding what matters to the organization as a whole,
and measuring in ways that illuminate that understanding. Unless performance is measured, it is
impossible to gauge whether it has improved. Therefore, measurements form part of the foundation of
maturity.
Metrics can be measured and applied at the operational, tactical, or strategic level. A classic error in
the world of service and support is mistaking operational metrics for strategic ones, and reporting
them up to executives who are looking for something quite different.
• As maturity increases, metrics are increasingly aligned to key performance indicators, which are
more organizationally aligned (see below).
• Consistency, repeatability, and predictability are increased through the considered collection
and analysis of metrics.
• Metrics should increasingly focus on quality over quantity, and outcomes over activities.
• Activity-based metrics are the most widely quoted and used in the literature of the
support center. Call or contact volume, handle time, speed to answer, and the like are all
based on the activities of support, not on business outcomes. The activities of support
are mostly reactive: something broke, someone contacted support, and that particular
incident was resolved, for example. These metrics can be “gamed” quite easily.
Example: First contact resolution – User calls unable to access an application. The
support analyst resets the user password and marks the case resolved. Ten minutes
later, the user contacts the support center again because that did not work. A new
ticket is opened and resolved on first contact. Result: Two tickets are scored as FCR when
there was only one incident (see the sketch following this list).
• Outcomes-based metrics are more tactical and sometimes strategic: as a result of
a changed support process or technology, the sales team was able to convert more
potential orders, or the marketing team had a higher response rate, for example. These
metrics are aligned with the organization’s goals as opposed to the goals of the support
center alone. These metrics are harder to “game.” The example used by the Open
Customer Metrics Framework is the Customer Effort Score, which measures how easy
or difficult it was for a customer to obtain and benefit from support.
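The sketch below works through the first contact resolution example above: a naive count credits
both tickets as FCR, while a repeat-contact adjustment does not. The ticket data and the 24-hour
repeat window are assumptions made for the illustration, not HDI or ITIL definitions.
Example (Python):

from datetime import datetime, timedelta

# Illustrative sketch only: the ticket data and the 24-hour repeat-contact
# window are invented assumptions.
tickets = [
    # (user, opened, marked_resolved_on_first_contact)
    ("pat", datetime(2024, 3, 1, 9, 0), True),   # password reset, marked FCR
    ("pat", datetime(2024, 3, 1, 9, 10), True),  # same issue, ten minutes later
    ("lee", datetime(2024, 3, 1, 10, 0), True),
]

def naive_fcr(rows):
    """Counts every ticket flagged as resolved on first contact."""
    return sum(1 for _, _, fcr in rows if fcr) / len(rows)

def repeat_adjusted_fcr(rows, window=timedelta(hours=24)):
    """Discounts a 'first contact resolution' if the same user comes back
    within the window, on the assumption the issue was not truly resolved."""
    credited = 0
    for i, (user, opened, fcr) in enumerate(rows):
        repeat = any(u == user and opened < o <= opened + window
                     for j, (u, o, _) in enumerate(rows) if j != i)
        if fcr and not repeat:
            credited += 1
    return credited / len(rows)

print(f"Naive FCR: {naive_fcr(tickets):.0%}")                      # 100%
print(f"Repeat-adjusted FCR: {repeat_adjusted_fcr(tickets):.0%}")  # 67%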
Identifying stakeholders
Perform an inventory of stakeholders to make sure that you are including all of them when you plan to
purchase a tool or technology. The example above of desktop support lacking reporting capabilities is
one consequence of overlooking a stakeholder. Another might be that your organization is considering
forming a knowledge management group (not just for IT) and may be planning to unify knowledge
management across the organization in the next two years. If the tool you are evaluating lacks the
capability to expand, it will prove a poor choice.
Implementation considerations
Implementation can be expensive and complicated. How many procedures or processes will have to
be changed? How much of the implementation work can be done by existing staff, and how much will
need to be contracted out to the tool developer or a third party?
The watchwords of frameworks and methodologies are adopt and adapt. It is not necessary to choose
one “right way” to guide an organization; elements of several frameworks and/or methodologies
can be used together. Many articles, blogs, and white papers have been published on the topic of
integrating different frameworks and methodologies, so do not let anyone convince you that you have
to be “an ITIL shop” or a “DevOps shop.” There is no reason you cannot use both sets of guidance—
and add more if your organization needs them or decides they will help.
Examples:
• ITIL and KCS
• ITIL and COBIT
• DevOps and IT service management
The key is not to become bound by ideology; instead, take the ideas the various frameworks have to
teach and build upon them for your organization’s future.
Set the bar for performance a little above where you have it now. Perhaps raising your commitments
under your service level agreement (SLA) by one percent per quarter will do it. Year over year, that
performance will begin to show great results.
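As a rough illustration of how those small increases add up, the sketch below assumes a starting SLA
target of 90% (an invented figure) and an increase of one percentage point per quarter.
Example (Python):

# Illustrative arithmetic only: the 90% starting target is an assumption.
target = 90.0
print(f"Starting SLA target: {target:.0f}%")
for quarter in range(1, 9):
    target += 1.0  # raise the commitment by one percentage point per quarter
    print(f"After quarter {quarter}: {target:.0f}%")
# Two years later the commitment has moved from 90% to 98%, without any
# single quarter demanding a dramatic jump.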
About HDI
Our mission is to elevate the customer experience through the development of the
technical support industry.
About TeamDynamix
Service and project portfolio management designed specifically for higher education.
A single platform that brings together IT service management, extending to Facilities,
Resident Life, Admissions, HR, and more, along with project portfolio management.