organizations to make informed decisions about which applications to keep, invest in, retire, or consolidate. For beginners in Application Portfolio Management (APM), creating a simple, clear, and actionable scoring model provides a structured way to assess the relative value and priority of applications.
This section provides a step-by-step guide to building a basic scoring model tailored for organizations starting their APM journey.
1. Purpose of a Scoring Model
A scoring model helps organizations:
- Objectively evaluate applications using predefined criteria.
- Prioritize decisions based on measurable data rather than subjective opinions.
- Align application decisions with strategic goals, such as cost reduction, risk mitigation, or digital transformation.
- Enable transparent communication with stakeholders about the rationale behind portfolio decisions.
Tip: The scoring model should be simple enough to implement but flexible enough to adapt as the organization’s APM practice matures.
2. Defining Key Evaluation Criteria
The foundation of any scoring model is the criteria used to assess applications. For beginners, focus on the following essential categories:
- Business Value
- How critical is the application to achieving business objectives?
- Does it directly support revenue generation or customer satisfaction?
- Scoring example:
- 5 = Mission-critical to business operations.
- 3 = Important but not essential.
- 1 = Minimal business impact.
- Cost
- What is the total cost of ownership (TCO), including licensing, maintenance, and support?
- Is the cost justified by the application’s usage and value?
- Scoring example:
- 5 = Low cost relative to value.
- 3 = Moderate cost.
- 1 = High cost with limited ROI.
- Usage and Adoption
- How widely is the application used within the organization?
- Are user satisfaction and adoption levels high?
- Scoring example:
- 5 = High usage and positive user feedback.
- 3 = Moderate usage or mixed feedback.
- 1 = Low usage or frequent complaints.
- Technical Health
- Is the application technically sound, or does it pose risks (e.g., outdated technology, poor performance)?
- Scoring example:
- 5 = Modern, well-maintained, and high-performing.
- 3 = Some technical issues but manageable.
- 1 = Significant technical debt or risk.
- Risk and Compliance
- Does the application comply with regulatory requirements (e.g., GDPR, HIPAA)?
- Are there security vulnerabilities or operational risks?
- Scoring example:
- 5 = Fully compliant and low risk.
- 3 = Moderate risk or compliance gaps.
- 1 = High risk or severe compliance issues.
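These criteria and their 1/3/5 anchors can be kept in a single shared definition so every evaluator scores against the same rubric. Below is a minimal sketch in Python; the name RUBRIC and the dictionary layout are illustrative choices, and the anchor text simply mirrors the scoring examples above:

```python
# Shared scoring rubric: each criterion maps a 1-5 score to the anchor
# description used in this guide. Keeping it in one place keeps scoring consistent.
RUBRIC = {
    "Business Value": {
        5: "Mission-critical to business operations",
        3: "Important but not essential",
        1: "Minimal business impact",
    },
    "Cost": {
        5: "Low cost relative to value",
        3: "Moderate cost",
        1: "High cost with limited ROI",
    },
    "Usage and Adoption": {
        5: "High usage and positive user feedback",
        3: "Moderate usage or mixed feedback",
        1: "Low usage or frequent complaints",
    },
    "Technical Health": {
        5: "Modern, well-maintained, and high-performing",
        3: "Some technical issues but manageable",
        1: "Significant technical debt or risk",
    },
    "Risk and Compliance": {
        5: "Fully compliant and low risk",
        3: "Moderate risk or compliance gaps",
        1: "High risk or severe compliance issues",
    },
}
```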
3. Assigning Weights to Criteria
Not all criteria are equally important. Assign weights to each criterion based on organizational priorities. For example:
- Business Value: 40%
- Cost: 25%
- Usage and Adoption: 15%
- Technical Health: 10%
- Risk and Compliance: 10%
Tip: Engage stakeholders to determine which criteria are most important and adjust weights accordingly.
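Once agreed, the weights can live next to the rubric and be checked once so the model never silently drifts out of balance. A minimal sketch, assuming the illustrative 40/25/15/10/10 split above (adjust the values after the stakeholder discussion):

```python
# Example weights from this section; expressed as fractions of 1.0.
WEIGHTS = {
    "Business Value": 0.40,
    "Cost": 0.25,
    "Usage and Adoption": 0.15,
    "Technical Health": 0.10,
    "Risk and Compliance": 0.10,
}

# Guard against a common mistake: weights that no longer sum to 100%
# after someone edits a single value.
assert abs(sum(WEIGHTS.values()) - 1.0) < 1e-9, "Criterion weights must sum to 100%"
```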
4. Creating a Scoring Template
A simple scoring template ensures consistency across evaluations. Use a table or spreadsheet format with the following columns:
- Application Name
- Business Value Score (1-5)
- Cost Score (1-5)
- Usage Score (1-5)
- Technical Health Score (1-5)
- Risk Score (1-5)
- Weighted Total Score
Example calculation for an application:
| Criterion | Score | Weight | Weighted Score |
| --- | --- | --- | --- |
| Business Value | 5 | 40% | 2.0 |
| Cost | 3 | 25% | 0.75 |
| Usage and Adoption | 4 | 15% | 0.6 |
| Technical Health | 3 | 10% | 0.3 |
| Risk and Compliance | 2 | 10% | 0.2 |
| Total Weighted Score | | | 3.85 |
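The same calculation can be scripted so every application is scored identically. A minimal sketch that reproduces the example above; the function name weighted_total is an illustrative choice, and the weights repeat the example split from Section 3:

```python
# Example weights from Section 3 (fractions of 1.0).
WEIGHTS = {
    "Business Value": 0.40,
    "Cost": 0.25,
    "Usage and Adoption": 0.15,
    "Technical Health": 0.10,
    "Risk and Compliance": 0.10,
}

def weighted_total(scores: dict[str, int], weights: dict[str, float]) -> float:
    """Multiply each 1-5 criterion score by its weight and sum the results."""
    return sum(scores[criterion] * weight for criterion, weight in weights.items())

# Scores from the example calculation above.
example_scores = {
    "Business Value": 5,
    "Cost": 3,
    "Usage and Adoption": 4,
    "Technical Health": 3,
    "Risk and Compliance": 2,
}

print(round(weighted_total(example_scores, WEIGHTS), 2))  # -> 3.85
```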
5. Automating the Scoring Process
For beginners, automation can streamline the scoring process. Tools to consider:
- Spreadsheets: Use formulas to calculate weighted scores automatically.
- APM Tools: Many entry-level APM platforms include built-in scoring features.
- Business Intelligence Tools: Platforms like Power BI or Tableau can visualize scores and trends.
Tip: Keep the process simple initially to avoid overwhelming teams.
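For teams that keep the scoring template in a spreadsheet export, the weighted totals can be computed in bulk rather than row by row. A minimal sketch using pandas; the file name applications.csv and the simplified column names are assumptions, so match them to your own template:

```python
import pandas as pd

# Hypothetical export of the scoring template: one row per application,
# with a name column plus the five 1-5 criterion scores.
apps = pd.read_csv("applications.csv")

WEIGHTS = {
    "Business Value Score": 0.40,
    "Cost Score": 0.25,
    "Usage Score": 0.15,
    "Technical Health Score": 0.10,
    "Risk Score": 0.10,
}

# Weighted Total Score = sum of (score x weight) across the five criteria.
apps["Weighted Total Score"] = sum(
    apps[column] * weight for column, weight in WEIGHTS.items()
)

# Rank the portfolio from strongest to weakest candidates.
print(
    apps.sort_values("Weighted Total Score", ascending=False)[
        ["Application Name", "Weighted Total Score"]
    ]
)
```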
6. Interpreting Scores
Once scores are calculated, interpret them to guide portfolio decisions:
- High Scores (4.0–5.0): High-value applications worth retaining or investing in.
- Medium Scores (2.5–3.9): Applications that may need improvement or closer monitoring, and candidates for eventual replacement if they do not improve.
- Low Scores (1.0–2.4): Candidates for retirement, consolidation, or replacement.
Tip: Combine quantitative scores with qualitative insights from stakeholders to ensure a holistic evaluation.
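These score bands translate directly into a first-pass recommendation that can then be reviewed with stakeholders. A minimal sketch using the same thresholds; the function name and the recommendation labels are illustrative, echoing the dispositions listed above:

```python
def recommend(weighted_score: float) -> str:
    """Map a weighted total score to a first-pass portfolio recommendation."""
    if weighted_score >= 4.0:
        return "Retain / invest"                  # High score: high-value application
    if weighted_score >= 2.5:
        return "Improve or monitor"               # Medium score: watch, possibly replace
    return "Retire, consolidate, or replace"      # Low score

print(recommend(3.85))  # -> "Improve or monitor" (the worked example from Section 4)
```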
7. Communicating Results to Stakeholders
The scoring model’s results should be presented in a clear and actionable format:
- Visual Summaries: Dashboards or charts showing top-performing and underperforming applications.
- Prioritization Reports: Lists of applications grouped by score range.
- Decision Recommendations: Suggested actions based on scores (e.g., retire, invest, replace).
Tip: Tailor the presentation to your audience (e.g., detailed for IT teams, high-level summaries for executives).
8. Continuous Improvement of the Scoring Model
As the organization matures, refine the scoring model by:
- Adding new criteria, such as innovation potential or customer satisfaction.
- Adjusting weights to reflect changing business priorities.
- Incorporating advanced analytics or AI tools for more sophisticated scoring.
Tip: Schedule periodic reviews of the scoring model to ensure it remains relevant and effective.
9. Benefits of a Scoring Model
Implementing a scoring model provides numerous benefits, including:
- Increased transparency and objectivity in decision-making.
- Clear prioritization of applications for investment, rationalization, or retirement.
- Improved alignment of the application portfolio with organizational strategy.
By creating a scoring model, organizations can systematically evaluate their application portfolio and take actionable steps to optimize it. This foundational practice not only drives early success in APM but also sets the stage for more advanced techniques as the organization grows in maturity.