Special Edition:
SAP User Experience


The SAP User-Centered Design Process

By Design & Research Methodology, SAP User Experience, SAP AG – December 15, 2006

This page presents SAP's new design process, called SAP User-Centered Design (UCD).

Overview

 

Introduction

SAP User-Centered Design (UCD) is a philosophy and a set of methods focused on designing for, and involving, end users in application development to achieve high-quality user experiences and high-quality SAP products. The SAP UCD process is based on proven, essential design processes and accountability across the entire design lifecycle.

UCD results in more usable and satisfying systems, making SAP software more effective, efficient, easy to learn, pleasant to use, and predictable – in essence, a high-quality user experience, contributing to high-quality products, and ultimately, more sales, market share, and revenue for SAP.

UCD is based on four fundamental principles:

  • Focus on real end users
  • Validate requirements and designs
  • Design, prototype, and develop iteratively
  • Understand and design for the holistic user experience

The SAP UCD process embraces these principles in three functional phases:

Figure 1: The phases of SAP User-Centered Design

 

Fundamentals

The foundational principles and functional phases of UCD are applicable to all SAP development. The SAP UCD process primarily involves iterative end-user activities that power the entire process; specifications that document the work; and validations that ensure success. The activities are organized around understanding end users' needs, scoping and defining interactions based on that understanding, and designing user interfaces (UIs) from the interaction definitions.

Figure 2: SAP User-Centered Design in the context of SAP development

 

Overview

Phase 1. Understand Users

Product development begins with a vision of a product, which includes a vision of the users for that product. A vision, however, is not enough to start design. Every product has different users. Some products have many different types of users. Even new versions of old products have a changing user population. Business, and business software, is a complex and ever-changing domain. Therefore, it is critical to accurately understand end users' needs and pain points.

The SAP UCD process relies on iterative user research to understand users and their needs. Knowledge databases of existing users are a good start; however, it is important to involve potential end users from the outset of UCD. Focus groups, interviews, and field research form the basis of the first two phases of UCD. To ensure that end users and their needs are sufficiently understood, the first phase examines the user population, their work, and their needs.

Phase 2. Define Interaction

The most common failure point in UCD processes is transferring the understanding of users to UI design. Even simple products struggle without a clear definition. For SAP products, the key is to define interaction first, without designing it.

First, all the user research conducted is organized and summarized in a user research synthesis, leading to user profiles, work activities, and requirements for the intended user populations.

The summarized user research information feeds directly into use cases, which define a product's use, i.e., its interaction. To start the use cases, a subset of work activities is identified and organized into a coherent product, with a high-level overview of how information will flow throughout the application. Then the specified work activities are captured in further detail with goal-based use cases. The use cases show the steps to accomplish task goals and the data needed to perform the interactions. The data definitions are the only elements of an "interface" that need to be determined in this phase; therefore, dialogs, buttons, tabs, labels, and all other interface elements are not mentioned.

Completed use cases are validated with the intended user population. This is a checkpoint to see if the vision is being achieved and the value is clear to users and SAP.

Phase 3. Design UI

The third phase of UCD is to design the UI, evolving directly from the interaction definition. Product scope and interface organization are clear from the high-level information organization, and UI components are clear from use case steps and data.

A primary concern with design is to not get locked into a single solution too early. To help prevent such design traps, this phase is explicitly broken into two stages: low-fidelity prototyping and high-fidelity prototyping. Low-fidelity prototypes allow experimentation and rapid evaluation. High-fidelity prototypes provide exacting design and behavior previews of the final product that specify what is to be coded. Iterative user evaluations at both stages are geared to be fast and effective in improving the UI: design feedback, rapid iterative evaluations, and usability evaluations.

Development Validation

After UIs are designed, communication with development must continue. Two key reviews are included: one UX review before handoff to development to ensure design quality, and one after development to ensure design compliance.

Lastly, completed products are formally tested for usability in a benchmark usability test. These tests monitor ongoing improvement over time, against prior products and versions as well as against the competition.

 

Ownership

While UCD is a shared process across SAP, owners are designated for each stage. Stage owners are responsible for completing their work to best enable the next step in the process, ultimately leading to high-quality designs and products.

Understand Users
  Iterative user research (field research, focus groups, interviews): Solution Management w/ UX

Define Interaction
  User research synthesis: Solution Management w/ UX
  Use cases: Solution Management w/ UX
  Specification: Solution Management w/ UX

Design UI
  Prototypes: UX w/ Solution Management
  User evaluations: UX w/ Solution Management
  Specification: UX w/ Solution Management

Development Validation
  UX reviews: UX / Solution Management / Dev
  Benchmark usability tests: UX / Solution Management / Dev

 

Levels of Support

All product development projects are different; however, a UCD process needs to be implemented in all of them. Team size, time availability, material resources, and product complexity make each development project unique. Three levels of UCD support are available to deal with varying resources and constraints.

  • 'A' level -- apply the full UCD process
  • 'B' level -- implement the validation steps & recommended activities
  • 'C' level -- utilize recommended methods and specifications

All large-scale development efforts within SAP should apply the full UCD process, support level 'A'. The full UCD process should easily fit within a 1+ year development plan.

Medium-scale development efforts should also strive to follow the full UCD process. Full UCD is the best insurance for achieving a high-quality user experience in our products.

Some medium and small-scale development efforts (e.g., 'Agile') should at a minimum implement the validation steps of the UCD process, support level 'B'. The validation steps are ongoing UCD checks of the progress towards achieving a high-quality user experience. Validation ensures that the philosophical foundation of UCD is sufficiently addressed, despite project tradeoffs that limit specific steps of the full UCD process.

Small, exploratory, and low-impact development efforts can utilize various steps of the UCD process as they are useful, support level 'C'. Fundamentally, the philosophy of Understanding Users, Defining the Interaction, and Designing the UI still applies to even the most constrained projects. If at all possible, early end-user consideration (focus groups), use cases, iterative design, and some user evaluation of designs should be explicitly incorporated into any development project.

In addition to the main support levels, the UCD process has the flexibility to adapt to each project's uniqueness through the depth and breadth of: 1) user research, 2) iteration, and 3) general effort. The first two are explicit choices of process; the third is implicit across the whole process.

Ultimately, the consideration of support level and depth and breadth of effort needs to balance the unique project constraints with the ultimate goal of developing high-quality SAP products.

The following diagram shows a rough timeline for an "average" full UCD project. Note that the actual times will vary for each phase, depending on scale. The widths are only meant to represent relative comparisons between the different activities.

Figure 4: Rough timeline for an "average" full UCD project

 

Details

Summary Table

1. Understand Users

Iterative user research (focus groups, interviews, and field studies)
  Goal: Collect up-to-date, accurate, in-depth information on intended user populations
  Deliverable: Each user activity summarizes findings in a report

Specification: User profiles, work activities, and user requirements
  Goal: Describe user profiles and work activities for the target user population; derive user requirements from user profiles and work activities
  Deliverable: Specification

2. Define Interaction

User research synthesis
  Goal: Organize and summarize user research from Phase 1, Understand Users
  Deliverable: User research synthesis report

Use cases: High-level information organization, use cases, and data flows
  Goal: Translate user work activities associated with user requirements into goal-driven, interactive, step-by-step use cases, appropriate for the user profiles
  Deliverable: Use cases specification

Use cases validation
  Goal: Validate user understanding and product definition with end users who use the product and customers who buy the product
  Deliverable: A component of the use cases specification

3. Design UI

Low-fidelity prototypes and key decisions
  Goal: Create quick, inexpensive, flexible design mockups of product components, use cases, etc.
  Deliverable: Designs (and specification where appropriate)

Iterative user evaluation (design feedback, rapid iterative evaluations)
  Goal: Improve design by evaluating usability issues associated with low-fidelity prototypes
  Deliverable: Each user activity summarizes findings in a report

High-fidelity prototypes and interaction behavior
  Goal: Create stand-alone prototypes of real applications that mimic full design and interactive behavior as closely as possible
  Deliverable: Prototypes and UI design specification

Iterative user evaluation (rapid iterative evaluations, usability evaluations)
  Goal: Improve design by evaluating usability issues associated with high-fidelity prototypes
  Deliverable: Each user activity summarizes findings in a report

Development Validation

UX review: UI design
  Goal: Review UI design for quality, before development
  Deliverable: UI Scorecard

UX review: UI compliance
  Goal: Review completed UI development for compliance with UI Standards, after development
  Deliverable: UI Scorecard

User validation: Benchmark usability test
  Goal: Benchmark completed product usability with a standardized formal usability test
  Deliverable: Benchmark usability test report

1. Understand Users

Iterative user research (focus groups / interviews / field studies) (as needed)

User research is needed to develop up-to-date, accurate, and in-depth descriptions of users' work needs. Given how quickly these needs can change in fast-moving high-tech environments, it is critical to continuously talk to end users to ensure that the information used to inform product design is current, and not based on dated information. Where possible, users' needs should be assessed in their normal work contexts (e.g., with field research) or by interviewing them (e.g., in focus groups or individually).

Performing adequate user research at this stage is a resource-intensive activity that should be planned across multiple products. Individual teams should attempt to collaborate and participate in larger research efforts, mining existing information, and/or conducting focused research to form an appropriate user assessment.

How to: Focus groups

Focus groups are coordinated group interviews. As with field studies and interviews, there are many different types of focus groups.

Recommended approach: Interview groups of people who perform similar jobs, following interview scripts that probe current roles, responsibilities and tasks, existing product feedback, and desired enhancements and functionality.

Deliverable: A report on the exact method used and resulting targeted user information.

How to: Interviews

Interviews simply follow question scripts that probe the same issues as focus groups, albeit individually. This is useful for difficult-to-reach (e.g., C-level), remote, or rare users.

Recommended approach: Interview individuals who perform similar jobs, following question scripts that probe current roles, responsibilities and tasks, existing product feedback, and desired enhancements and functionality.

Deliverable: A report on the exact method used and resulting targeted user information.

How to: Field research

Field research involves observing users in their natural work environments.

Recommended approach: Observe users as they conduct their work and ask them questions about the work observed (note: the software they use is only a part of this method; also focus on all the users' activities and the tools used to perform job tasks).

Deliverable: A report on activities, reasons for activities, sequences of activities, tools used, user wants & needs, pain points, etc.

2. Define Interaction

User research synthesis (required)

The User Research Synthesis consists of user profiles, descriptions of work activities, and user requirements for the targeted user population. User profiles describe users' distinguishing characteristics and the aspects of their work environment that can impact the product design. Work activities are the tasks that users do to accomplish their jobs and that should be represented in the targeted product. User requirements are primarily derived from the work activities observed and discussed in the user research; however, direct requests are also considered.

Recommended approach: Synthesis is fast and flexible. It utilizes white boards, conference tables, post-its, note cards, etc. The idea is to use an environment where ideas can be rapidly added and revised.

Deliverable: The User Research Synthesis Report details the user profiles, work activities, and user requirements distilled from user research.
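To make the relationship between these three parts concrete, the following minimal sketch (in Python) shows how user requirements might be derived from observed work activities and their pain points. The profile, activities, and derivation shown are illustrative assumptions only, not the SAP synthesis template.

# Hypothetical sketch relating user profiles, work activities, and the user
# requirements derived from them; the content is illustrative, not an actual
# SAP User Research Synthesis.
from dataclasses import dataclass
from typing import List

@dataclass
class WorkActivity:
    name: str
    pain_points: List[str]

@dataclass
class UserProfile:
    role: str
    environment: str            # work-environment characteristics that impact design
    activities: List[WorkActivity]

accounts_payable_clerk = UserProfile(
    role="Accounts payable clerk",
    environment="High interruption rate; dual monitors; heavy keyboard use",
    activities=[
        WorkActivity("Match invoices to purchase orders",
                     pain_points=["manual lookup of PO numbers", "frequent context switching"]),
        WorkActivity("Resolve blocked invoices",
                     pain_points=["unclear reason codes"]),
    ],
)

# User requirements are primarily derived from the observed work activities
# and their pain points (direct requests would be added alongside these).
requirements = [
    f"Support '{a.name}': address {p}"
    for a in accounts_payable_clerk.activities
    for p in a.pain_points
]
print("\n".join(requirements))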

Use cases (required)

Use cases consist of: 1) a high-level information organization, 2) task flow steps, and 3) the associated data for the use case steps. The information gained from understanding users drives the interaction definition. The primary deliverable is the use cases: sequences of interactions, and the data exchanged, between one or more users and a system to accomplish goals. The identified work activities and requirements guide the use cases; the user profiles and requirements determine the depth and variation of use cases needed.

Recommended approach: Use cases start fast and flexible, like the user research synthesis; in fact, the transition can occur at the same time. As the use cases become finalized, they are documented in the final specification. Validation with end users is required to confirm that the vision is solid and to ensure that user needs have been met. The validation uses either the focus group or interview techniques described above. The validation write-up is incorporated into the specification.

Deliverable: The Use Cases Specification Template details high-level information organization, task flow steps, and associated data for the use case steps.
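As an illustration of the kind of content a goal-based use case captures, here is a minimal, hypothetical sketch in Python. The use case, actors, steps, and data names are invented for illustration and are not taken from the SAP Use Cases Specification Template; note that only goals, steps, and data appear, with no dialogs, buttons, or other interface elements.

# Hypothetical sketch of a goal-based use case record; field names are
# illustrative and not part of the SAP Use Cases Specification Template.
from dataclasses import dataclass, field
from typing import List

@dataclass
class UseCaseStep:
    actor: str                 # who performs the step (end user or system)
    action: str                # what is done to move toward the goal
    data: List[str] = field(default_factory=list)  # data exchanged in this step

@dataclass
class UseCase:
    name: str
    goal: str                  # the task goal the user wants to accomplish
    user_profile: str          # which intended user population performs it
    steps: List[UseCaseStep] = field(default_factory=list)

# Example: a small, invented "approve leave request" use case.
approve_leave = UseCase(
    name="Approve leave request",
    goal="Manager approves or rejects an employee's leave request",
    user_profile="Line manager",
    steps=[
        UseCaseStep("Manager", "Open list of pending leave requests",
                    data=["employee name", "requested dates", "leave type"]),
        UseCaseStep("Manager", "Review remaining leave balance",
                    data=["leave balance"]),
        UseCaseStep("Manager", "Approve or reject the request",
                    data=["decision", "optional comment"]),
        UseCaseStep("System", "Notify the employee of the decision",
                    data=["decision", "comment"]),
    ],
)

if __name__ == "__main__":
    print(f"{approve_leave.name}: {len(approve_leave.steps)} steps")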

3. Design UI

Low-fidelity prototypes (required)

"You can fix it now with an eraser, or you can fix it later with a sledgehammer."
- Frank Lloyd Wright, Architect

Low-fidelity prototypes are quick, inexpensive, and flexible sketches or mockups of tasks or flows, not the full product. The prototypes focus on concepts, metaphors, and alternatives. The goal is to improve design by rapidly iterating. Using low-fidelity designs ensures that iteration is built into the UCD process. Iteration is the surest path to achieving high-quality user interfaces.

How to:

Recommended approach: Use fast and flexible media: 1) pencils, pens, paper, and post-its, or 2) quick computer sketches and wire-framing tools. The goal is to explore many designs and narrow them down to a few. Creating many quick, low-fidelity prototypes is highly recommended at the beginning of this process; fewer prototypes with somewhat more fidelity are appropriate for later evaluation.

Iterative user evaluations (as needed)

Iterative user evaluations of low-fidelity prototypes are intended to match the rapid, iterative pace of design. Two methods are recommended. The first is using SAP employees to offer quick interface design feedback. The second is soliciting feedback from users for rapid, iterative evaluation based on tasks. Note that more than one iterative user evaluation activity can occur.

How to: Design Feedback

Recommended approach: Gather quick design feedback by showing early designs to internal SAP "users".

Deliverable: A specific report is not necessary; however, as appropriate, design decisions should be captured in the prototypes and summarized in the specification.

How to: Rapid Iterative Evaluation

Recommended approach: Conduct rapid user evaluations. In contrast to traditional usability evaluations, in which a single design is tested across a number of users, in a rapid iterative evaluation design changes are implemented after each user. Because of the emphasis on making rapid design improvements, traditional usability metrics are not collected unless they specifically contribute to rapid design decisions.

Deliverable: A report documenting the many design changes made during the study.

High-fidelity prototypes (required)

High-fidelity prototypes are stand-alone imitations of real applications that mimic the full design and interactive behavior as closely as possible. They encompass the full interaction definition where possible (information architecture, use cases, and data flows) and incorporate the low-fidelity design decisions. The goal is to specify the complete product interaction design before actual development.

How to:

Recommended approach: Create high-fidelity prototypes with full-featured authoring tools (HTML editors, Design Studio), and check them into the UI Gallery. Design is based on common interface elements.

Iterative user evaluations (as needed)

Iterative user evaluations of high-fidelity prototypes can utilize either rapid iterative evaluations or more traditional usability evaluations. The primary objective is to improve the interface design rather than to benchmark usability. More than one iterative usability evaluation activity can occur.

How to: Rapid Iterative Evaluation

Recommended approach: Use the same evaluation methodology as described for low-fidelity evaluations. The primary difference is the relative difficulty of making changes to more robust high-fidelity prototypes. Potential design changes should be anticipated and, if possible, multiple alternatives should be prepared and shown to users in one session.

Deliverable: A report similar to one prepared for low-fidelity designs.

How to: Usability Evaluations

Recommended approach: Conduct usability evaluations similar to formal usability tests (e.g., CIF, described below), except purposefully and frequently solicit feedback from users about usability and design at the expense of valid usability metrics (note: formal usability testing reverses this philosophy by limiting user feedback to ensure valid metrics). Solicit feedback from multiple users to evaluate a final prototype design. Evaluate usability across the full breadth of the prototype and then delve into identified issues as needed. Collect and report usability metrics if they do not inhibit user feedback.

Deliverable: A report that identifies usability issues from user feedback, supplies design recommendations for usability issues, and includes usability metrics as appropriate.

Specification: High-Fidelity Prototypes and other interaction behavior

The primary deliverable is the full prototypes. Associated design information for the prototypes is included in a report to detail interaction behavior that is not readily apparent in the prototypes themselves (e.g., describing how all errors are handled).

Development Validation

UX review: UI design (required)

The UI design review is a check of UI design quality by the User Experience team. Component and flow design is reviewed in detail prior to handing over high-fidelity prototypes to development.

How to:

Recommended approach: Designers are responsible for soliciting a review of the full prototypes before development begins. The Sr. VP of User Experience and the UI Standards team will review all products.

Deliverable: A simple list of action items is supplied by the UI Standards team at the conclusion of the review. See the guidelines for more information.

UX review: Standards compliance (required)

The standards compliance review is a final 'fit-and-finish' check of SAP UI Standards before code freeze.

How to:

Recommended approach: Product teams and designers are responsible for soliciting a standards compliance review after product development is completed. The Sr. VP of User Experience and the UI Standards team will review all products.

Deliverable: A compliance score and a listing of compliance issues written by the UI Standards team.
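As a rough illustration of how a compliance score and issue list could be assembled from such a review, here is a minimal sketch in Python. The checklist items and the simple percentage scoring rule are assumptions for illustration, not the actual SAP UI Scorecard.

# Hypothetical sketch of deriving a compliance score and issue list from a
# UI Standards checklist; items, results, and scoring rule are illustrative.
from dataclasses import dataclass
from typing import List, Tuple

@dataclass
class ChecklistItem:
    standard: str      # the UI Standard being checked
    passed: bool       # result of the fit-and-finish inspection
    note: str = ""     # description of the issue when the check fails

def score_compliance(items: List[ChecklistItem]) -> Tuple[float, List[str]]:
    """Return the percentage of checks passed and the list of open issues."""
    issues = [f"{i.standard}: {i.note}" for i in items if not i.passed]
    score = 100.0 * sum(i.passed for i in items) / len(items)
    return score, issues

checks = [
    ChecklistItem("Standard labels on mandatory fields", True),
    ChecklistItem("Consistent button placement", False, "Save/Cancel order swapped on two screens"),
    ChecklistItem("Error messages follow message guidelines", True),
    ChecklistItem("Keyboard navigation reaches all controls", False, "Date picker not reachable via keyboard"),
]

score, issues = score_compliance(checks)
print(f"Compliance score: {score:.0f}%")
for issue in issues:
    print(" -", issue)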

User validation: Benchmark usability test (required)

A benchmark usability test is a formal usability test on a real product that provides quantitative, benchmarking metrics. The deliverable is an SAP variation on the ANSI/INCITS 354-2001: Common Industry Format for Usability Test Reports (http://zing.ncsl.nist.gov/iusr/), which has become an accepted international standard (International Organization for Standardization – ISO; www.iso.org).

How to:

Recommended approach: A standard benchmark usability test is required to provide valid usability metrics. Tasks should be selected that represent the product's functionality, typically matching the corresponding use cases. A set of representative users from the identified user population should be tested. Interaction with users and probing for 'improving the product' feedback may still be done, but it is limited.

Deliverables: Two reports are required: 1) an external customer document, and 2) an internal confidential document. The external document focuses on standardized measures and data analyses following the SAP CIF-style report template. The internal document enhances the external report with more descriptive measures and specific recommendations for usability improvements.
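To illustrate the kind of quantitative, benchmarking metrics such a test yields, the following minimal Python sketch summarizes hypothetical session data into a per-task completion rate, mean time on task, and mean errors/assists. The data, tasks, and choice of measures are assumptions for illustration only and do not reproduce the SAP CIF-style report template.

# Minimal sketch of CIF-style summary metrics from benchmark usability test
# sessions; the data and the particular measures are illustrative assumptions.
from statistics import mean

# One record per participant per task:
# (task, completed?, time on task in seconds, number of errors/assists)
sessions = [
    ("Create purchase order", True, 312, 1),
    ("Create purchase order", True, 287, 0),
    ("Create purchase order", False, 600, 3),
    ("Approve leave request", True, 95, 0),
    ("Approve leave request", True, 120, 1),
    ("Approve leave request", True, 88, 0),
]

tasks = sorted({task for task, *_ in sessions})
for task in tasks:
    records = [r for r in sessions if r[0] == task]
    completion_rate = 100.0 * sum(r[1] for r in records) / len(records)  # effectiveness
    mean_time = mean(r[2] for r in records if r[1])                      # efficiency (completed runs only)
    mean_errors = mean(r[3] for r in records)
    print(f"{task}: {completion_rate:.0f}% completion, "
          f"{mean_time:.0f}s mean time on task, {mean_errors:.1f} errors/assists")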

 
