Software Engineering for Usability
6. Methodologies and Processes for UI Development
- 6.1 Integrating UI development into software development
- 6.2 Usability Engineering
- 6.3 Task-Centered Design Process
- 6.4 RADical Software Development
- 6.5 Object-oriented methodologies
- 6.6 Structured methodologies
- 6.7 Essential modeling
- 6.8 Formal and semi-formal methods
- 6.9 RMP (Requirements, Models, Prototypes)
- 6.10 Typical development lifecycles
- 6.11 Tools and Environments
- 6.12 Standards
Curtis and Hefley (of the Software Engineering Institute) identify three requirements for integrating user interface engineering into product engineering:
"... a process needs to be defined for specifying, designing, building, testing, and evaluating a user interface."
"... this defined process needs to be integrated with the defined process used for developing the remainder of the product (hardware, software, etc.)"
"... The organization must have an established project management discipline, so that it can manage a well-defined process and avoid making commitments that even a sound engineering process could not satisfy." (Curtis and Hefley 1994).
They give Nielsen's list of the stages involved in developing an interface (Nielsen 1992) while saying that it is necessary to describe these stages abstractly as part of defining a user interface engineering process, and note that the defined process must include requirements for testing the user interface and coordinating changes made as a result of those tests (in other parts of the software, the user manuals, training programs, etc.).
The field of usability engineering is reviewed in (Whiteside, Bennett et al. 1988; Nielsen 1992; Nielsen 1993). Nielsen summarizes the stages of interface development as:
* know the user: individual characteristics, current and desired tasks, functional analysis, evolution of the user and the job
* competitive analysis
* setting usability goals; financial impact analysis
* parallel design
* participatory design
* coordinated design
* guidelines and heuristic analysis
* empirical testing
* iterative design; capture design rationale
* collect feedback from field use
(Lewis and Rieman 1994) describe their "task-centered design process" as consisting of the following steps:
* figure out who's going to use the system to do what
* choose representative tasks for task-centered design
* plagiarize (using parts of old designs)
* rough out a design
* think about it
* create a mock-up or a prototype
* test it with users
* build it
* track it
* change it
RADical software development is:
"a customer driven application development lifecycle that:
* delivers quality solutions
* is an evolutionary process
* uses continuous application engineering techniques
* is performed by a dedicated professional team
* uses timeboxed project management
* is enabled by powerful development tools
* results in profound productivity benefits." (Bayer and Highsmith 1994)
Note the emphasis on the customer. Bayer and Highsmith describe the use of Joint Application Development (JAD) sessions, and Focus Groups to ensure that customer needs are met. They emphasize the importance of a "jelled" team (see also DeMarco and Lister 1987; Coplien 1994).
They say there are "four key aspects to re-engineering the software development process:
* The overall process, the software development life cycle, must undergo a transformation from a static, documentation orientation to a dynamic, evolutionary, product orientation.
* Doing activities faster requires a skill at `timeboxed' project management techniques.
* An evolutionary life cycle will not yield the desired benefits without promoting high performance, dedicated teams.
* Transformation to a RADical approach does not mean abandoning the critical software and information engineering skills gained in the last 15 years. Instead, these skills need to be redirected into -- continuous application engineering." (Bayer and Highsmith 1994)
Grady Booch says that "... object-oriented design embodies an incremental, iterative process", so the traditional waterfall model is not very applicable. He points out that a typical object-oriented project allocates its human resources heavily toward the front end of the process, with extensive up-front design and iteration (Booch 1994). This suggests that integrating HCI into an object-oriented process should be easier than integrating it into a traditional one. Booch says that it is "... dangerous to try to 'completely' analyze a system before even thinking about moving forward with design - such an attempt marks the beginning of analysis paralysis." The parallels with what has been said about HCI are obvious. Integrating HCI into software engineering will require organizational and personal changes analogous to (though different from) those required by a move to object-oriented design: a change in mind-set, training, shifted organizational priorities, and the expectation of a high start-up cost that pays off in the long run.
Of the three standard books on object-oriented analysis and design (Rumbaugh, Blaha et al. 1991; Jacobson, Christerson et al. 1992; Booch 1994), Jacobson's, with its "use cases", seems the most amenable to HCI issues. Recently, however, these three "big names" have joined the same company, Rational, and have agreed on how to unify their various approaches (Booch, Rumbaugh et al. 1996).
In a recent interview, Booch commented on their unification efforts and standardization of process:
"I think there is reasonable consensus on what distinguishes the successful process from the one that's not successful. The devil is in the details, the notion of architecture driven, incremental and iterative, and use-case driven. I think there's general consensus that's all good stuff. Whether or not you add a lot of ceremony to that or have low ceremony is where the degrees of freedom are." (Meyer 1996)
I note also a new book by Booch which touches on user-centered design: (Booch 1996)
One of the tutorials at the CHI'96 conference described an object-oriented GUI design model having the interesting feature that "...the chasm between task flow and fundamental user interface structure is spanned by several small, fairly easy steps, instead of by the traditional, exclusive reliance on a highly skilled designer's intuition." (Dayton, Kramer et al. 1996)
The MUSE methodology (Lim and Long 1994) is a structured human-factors methodology that can extend any standard structured systems methodology from software engineering so that it integrates HCI concerns. The primary focus of MUSE is on working with Jackson System Development, but the authors discuss how it would apply in other cases. It is not clear how MUSE would incorporate the prototyping and iterative improvement that are essential for good interface design.
In an earlier paper (Dowell and Long 1989), the MUSE authors emphasize that part of the specifications in any human factors engineering method must include specification of the users of the system. This could be charitably interpreted in more modern terms as "know thy user". But their definition of a user as "a system of distinct and related human behaviors, identifiable as the sequence of states of a person interacting with a computer to perform work, and with a purposeful (intentional) transformation of objects in a domain" seems to de-humanize the whole process.
Essential modeling (Constantine 1995) is a design process involving three independent models:
* user role model
* essential use case model
* use context model
Constantine says that use cases (e.g. Jacobson, Christerson et al. 1992):
"are written in terms of the features of a specified or assumed user interface" and so they "are not very helpful for designing the user interface; for this we need essential use cases. ... An essential use case is a simplified and generalized form of use case, an abstract scenario for one complete and intrinsically useful interaction with a system as understood from the perspective of users ..."
For example, in the context of an automatic teller machine, the use case might start: "User inserts card; system reads magnetic stripe, requests PIN; user types PIN; ..." whereas the corresponding essential use case would start "User identifies self; ..."
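The contrast between the two forms can be sketched as simple step sequences. This is a hypothetical illustration: the step wording beyond the fragments quoted above is invented here, not taken from Constantine.

```python
# Hypothetical sketch contrasting a concrete use case with an essential
# use case for the ATM example. Step wording beyond the quoted fragments
# is invented for illustration.

concrete_use_case = [
    ("user", "inserts card"),
    ("system", "reads magnetic stripe"),
    ("system", "requests PIN"),
    ("user", "types PIN"),
]

essential_use_case = [
    ("user", "identifies self"),
]

# The essential form leaves the interface decisions (card? PIN?
# biometrics?) open: any design that lets the user identify themselves
# satisfies it.
print(len(concrete_use_case), len(essential_use_case))
```

The essential use case is shorter precisely because it records user intentions rather than interface mechanics.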
The "user role model" encompasses the characteristics of the various users of the system (e.g. as might be derived from task analysis/contextual inquiry or other techniques).
The "use context model" is an abstract model of the architecture of a proposed user interface. The various parts of the user interface are represented as labeled boxes. The actual design of an interface involves finding a mapping from these boxes to the widgets that are available in the development environment.
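As a rough sketch of that final step, the design amounts to a mapping from the abstract labeled boxes to concrete widgets. All names below are invented for illustration; they are not drawn from Constantine's text or any particular toolkit.

```python
# Hypothetical sketch: the use context model's labeled boxes, and a
# mapping from each box to a widget offered by the development
# environment. All names here are invented for illustration.

use_contexts = ["identification", "transaction selection", "amount entry"]

available_widgets = {
    "identification": "card-reader dialog",
    "transaction selection": "button panel",
    "amount entry": "numeric keypad",
}

# Designing the concrete interface = choosing a widget for every box.
design = {box: available_widgets[box] for box in use_contexts}
print(design)
```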
There has been a lot of research directed toward formalizing the specification of user interfaces (Harrison and Thimbleby 1990; Fraser, Kumar et al. 1994; Rouff 1996). One of the main goals is to have design specifications that are unambiguous and can possibly be proved correct. One major fly in the ointment is that such an approach is likely to hinder communication with users.
One study (Bansler and Bødker 1993) reports that some companies have found that communication between users and developers was improved by the use of dataflow diagrams in the specifications, but these were cases where the users were technically oriented people (e.g. engineers and electricians). In general, such diagrams are not used for validating design with users: one developer says "...it's hopeless, it really is."
(Larson 1992) presents a design framework that breaks user interface design into five independent layers (structural, functional, dialog, presentation, pragmatic). Because decisions in one layer are largely independent of the others, the framework supports a more formal approach and reduces design risk.
"Requirements, Models, Prototypes is a structured process for HCI design for industrial software development teams." (Casaday and Rainis 1996) This process is routinely used by the authors at Digital. The process is based on three groups of design deliverables:
* Requirements deliverables include the following documents: user description, usability goals, scenarios of use.
* The "Models" documents describe the mental models that the users are expected to form of the program and the work that can be accomplished with the program. Constantine's essential use cases are an important part of the work models. A "User Interface Map" document gives an overview of the user interface and the navigation between screens.
* Prototypes: they use both paper and working prototypes to test concepts.
The use of "templates" is a distinguishing feature of their process - templates range from fill-in-the-blanks documents to interface components. The authors suggest that these templates are a good way to capture design experience.
Deborah Mayhew offers detailed suggestions on appropriate activities for a development process that integrates usability tasks into software development (Mayhew 1992; Mayhew 1996). She splits the process into four phases and indicates the appropriate points for the usability tasks relative to the typical development tasks:
Phase 1:
* high-level requirements definition
* define project scope
* overall project planning
* usability project planning
* project team organization
* usability role assignment
* contextual task analysis
* usability goal setting
* phase 2 planning

Phase 2:
* iterative UI walkthroughs
* UI conceptual model design
* style guide development
* prototype functional design
* prototype UI design
* iterative requirements definition
* iterative UI evaluation/testing
* system architecture design
* detailed UI design
* phase 3 planning

Phase 3:
* detailed system design
* test plan development
* iterative system testing
* iterative UI evaluation/testing
* customer acceptance testing
* phase 4 planning
In this section, I summarize the development process at Borland as it was described by the designers of Borland's Quattro Pro spreadsheet. (Rosenberg and Friedland 1994)
Minimizing the amount of support needed is important to Borland: it provides free technical support for 90 days, and each support call costs between $15 and $30 to service. Usability is also a prime marketing consideration for its products, so cost-justifying usability is not a problem at Borland. The company has a central usability group that reports directly to the chief technical officer. This gives usability high visibility, and upper management is involved in validating user interface designs and in setting usability objectives for products. The central user interface design group does all interface design for company products, which makes cross-product consistency possible without spending any effort on developing and maintaining a corporate style guide.
Borland's bug-tracking software has a central importance in managing product development. Anyone in the company can enter user interface problems into the bug-tracking system. In order to encourage people working on other projects to try out the various alpha and beta versions of products in development and to report bugs, the company offers cash bonuses ranging between $50 and $500 per bug reported. They also have a separate database dedicated to collecting usability issues from users (support calls, Usenet articles, etc.).
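Given the per-call figures quoted above, the bounty scheme is easy to cost-justify. A back-of-the-envelope calculation (only the dollar figures come from the text; the break-even framing is illustrative):

```python
# Break-even arithmetic for Borland's bug bounties, using the figures in
# the text ($15-$30 per support call, $50-$500 per reported bug). The
# break-even framing is illustrative, not from the source.

max_bounty = 500      # largest bonus paid per reported bug
min_call_cost = 15    # cheapest support call to service

calls_to_break_even = max_bounty / min_call_cost
print(calls_to_break_even)  # ~33.3: avoiding ~34 calls pays for the largest bounty
```

At the high end of call costs ($30), even the largest bounty pays for itself after only about 17 avoided calls.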
Borland's development process starts with the "product planning and initial design" phase. The design team reviews previous feedback from users and product reviewers and solicits selected users for ideas and comments on what features should be in the new product. They start designing with paper sketches which are discussed by a wide group of people. Some of these ideas progress to prototypes which are tested with focus groups of typical users. Often there are a dozen or more iterations of prototypes. At the end of this stage, the prototype becomes the living specification for the user interface - it is complete enough to demonstrate the overall design even though not all the dialog boxes or controls have been implemented.
The next stage is "product implementation". The details of the interface are designed and implemented. At various points in this development, both heuristic evaluation and formal usability testing are performed on the partially completed interface, and any major usability problems result in redesign and further iterations of the tests.
The last phase is "product completion" which begins with the last round of usability testing when all product features have been implemented. After any changes necessitated by the results of these tests have been made, the user interface design is "frozen" and the user manuals are completed. Further usability testing is often now assigned to an independent consulting company who will do comparative testing with competing products. The results of these independent tests can then be used in product marketing to trumpet usability advantages.
(Coplien 1994) reports on Borland's software process. The company bases its development on small teams of expert programmers with knowledge of the product domain, who rely on iterative development and who communicate primarily orally via frequent design meetings. One team of 8 people had written about 1 million lines of C++ code in 31 months -- an astounding productivity of more than 1000 lines per programmer per week. Coplien found that the team spirit and strong communication were likely factors in their success. He comments that "although the organization has no codified system of process, it is keenly aware of what it does, how it does it, and what works. It views software development as something fundamentally driven by special cases ... and repeatability is not an important part of their value system."
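The productivity figure is easy to check. Assuming, for simplicity, 4 working weeks per month (that assumption is mine, not stated in the source):

```python
# Sanity-checking the productivity claim: 1 million lines of C++ by a
# team of 8 in 31 months. The 4-weeks-per-month simplification is an
# assumption made here, not stated in the source.

total_lines = 1_000_000
team_size = 8
weeks = 31 * 4  # 31 months, assumed 4 working weeks each

lines_per_programmer_week = total_lines / team_size / weeks
print(round(lines_per_programmer_week))  # about 1008 lines per programmer-week
```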
A lot of research has been concerned with building tools to support usability engineering. I mention here only a few examples.
The ITS (Interactive Transaction System) tool (Wiecha, Bennett et al. 1990) is a model-based system that contains four major components: an action layer, a dialog layer, style rules, style programs. The developer builds a semantic model of the interface and then tools are used at different stages of development to automate code generation, etc. The authors claim that their tools can be used to capture even complicated design rules and exceptions. The ITS tool has been used to develop several successful real-world projects.
The GENIUS tool (Bullinger, Fähnrich et al. 1996) permits automatic generation of a user interface from the data model.
Some specialized development environments have been built:
(Butler 1995) describes "a prototype user-centered development environment in which business-oriented components are first derived from models of business processes, then created as reusable software objects for assembly into applications ...".
(Fischer, Lindstaedt et al. 1995) describe "a prototype, domain-oriented environment for developing systems" ... "to design systems that embody a model of the objects users need to manipulate and the tasks they need to perform ...".
The IEEE standard 1074 for developing life cycle processes (IEEE 1991) describes the various activities that should be part of a well-managed software development process without prescribing any one particular software life-cycle. The corresponding ISO standard is ISO 12207 (ISO 1995).
The ISO draft standard ISO DIS 9241-11 (ISO 1996b) describes how to specify measurable usability criteria.
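In the spirit of ISO 9241-11's effectiveness/efficiency/satisfaction breakdown, measurable criteria might be recorded like this. All metric names and target values below are invented for illustration; they are not taken from the standard.

```python
# A hedged sketch of measurable usability criteria in the spirit of
# ISO 9241-11's effectiveness/efficiency/satisfaction breakdown.
# All metric names and target values are invented for illustration.

usability_goals = {
    "effectiveness": {"task completion rate": 0.90},  # fraction of tasks done
    "efficiency": {"tasks per hour": 20},             # throughput
    "satisfaction": {"rating (1-7 scale)": 5.5},      # mean questionnaire score
}

def meets_goals(measured, targets):
    # Assumes "higher is better" for every metric; a time-per-task
    # metric would need the opposite comparison in a real harness.
    return all(measured.get(m, 0) >= t for m, t in targets.items())

observed = {"task completion rate": 0.93}
print(meets_goals(observed, usability_goals["effectiveness"]))  # True
```

The point is only that the goals are stated numerically, so a usability test can pass or fail them objectively.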
There is an ISO working group on HCI development processes. The current draft document (ISO 1996a) says "This International Standard provides guidance on human centered design activities throughout the life cycle of interactive computer-based systems. It is a tool for those managing design processes and provides guidance on sources of information and standards relevant to the human-centred approach."
The ISO 9000 series of quality standards is rapidly becoming important in industry. There are several books discussing the implications for software quality. One such reference is (Ince 1994).
The Software Engineering Institute's Capability Maturity Model for software development organizations is explained in (Paulk, Curtis et al. 1995).