Abstract
It's the dawn of an age in which interactive
functionality and information are available and intertwined everywhere.
The past two decades have been a pre-dawn period during which products,
software, environments, functionality, and interaction with information
have gradually converged. What lessons have been learned within
a single consulting design career that has pursued convergence
in these areas from the beginning?
What are some examples of these design
disciplines overlapping in past projects? How do the constraints of limited physical affordances, computing
power, and small displays affect the user experience of devices,
and what are some strategies for designing them to be more successful?
How can a rewarding balance between efficiency, friendliness, flexibility,
and, inevitably, expandability be achieved when designing in these
areas? What design strategies can help devices,
their software, information architectures, and network infrastructures
leverage each other for greater rewards?
While by no means exhaustive or inclusive
of all the many projects I've encountered during my career, this
presentation cuts across a representative sample of my design work. I will use examples from a wide range
of projects as a context in which to discuss issues related to the
above questions, all of which are increasingly important in today's
design challenges. As
a designer who has been among the few working in the field of product
and system user experience for this length of time, my goal
is to offer up some of the diverse ground I have covered and the
lessons this experience has taught me.
Earliest
Exposures to User Experiences and Interaction Design
Interestingly enough, I remember very
well the first time I was aware of a user interface and what I
felt were clever and whimsical ways to communicate function. It was the rabbit and turtle symbols near the throttle control
of a farm tractor. I
grew up on a farm in west-central Missouri, in a small rural community
where the highest technology was generally mechanical and agricultural
in nature. The rabbit symbol, used for fast, and
the turtle symbol, used for slow, struck me as a youngster as very
clever and communicative.
At the time I didn't fully grasp that such cognitive affordances
were not simply graphically clever, but also cognitively simpler
to grasp immediately and more broadly understandable across languages,
at least in cultures where fast rabbits and slow turtles were common.
Iconic Alphabet - 1967
In 1967 I designed my first set of
icons, each representing a letter of the alphabet. I'm happy that I still have the drawing, and am somewhat impressed
by the cleverness and simplicity of many of the symbols I drew. This would be a harbinger of things to
come.
During the 1960s I watched some television,
but our set only brought in two or three channels on a good day,
so many evenings consisted of sitting down with a volume from the
World Book Encyclopedia and randomly perusing and reading. This instilled both a love of informational surfing and a growing
appreciation for the concise manner in which information was organized
and presented, both textually and graphically. The experience of growing up on a farm where it was necessary
to use very different, yet integrated types of knowledge, skills,
and thinking, laid an important foundation for my generalist career
as a design consultant. From
early on I saw the integration of information, technology, and experience
"wearing different hats" as crucial to long-term success. Holistic or systems thinking didn't seem very unusual to me,
having been born into a family that had been farming for many generations. On the contrary, it was specialization
that was the far more unfamiliar concept.
Throughout the 1960s I picked up various
images and impressions of computers. They were very mysterious, as I'd never had any opportunity
to see one close up. My
notion of computers was a hazy mishmash of images from the media
- giant mainframes with blinking lights and spinning tape reels
in movies such as Desk Set, space-age computers in Batman's Batcave,
and even the cartoon character Mr. Peabody's Wayback Machine.
One common image that made an impression on me was the "readout,"
usually a ticker tape issued from a computer with the answer to
whatever problem had been posed to it. I had no idea how a question was transformed
into an answer inside a computer.
Lesson Learned: Symbols have the power to communicate concepts
Homemade computer and interaction
design drawings - 1971
In August 1971, just as I turned ten
years old, I could wait no longer and decided that I would design
and build my very own computer.
Though no photographs of the "working" cardboard
prototype survive, I still have my initial drawings.
Since my understanding of computers at the time wasn't about
number-crunching, but about question asking and answer retrieval,
I pursued a design that would accomplish that.
It's ironic how closely the drawings resemble the format
of many of my later interaction documents, showing sequential operation,
and tying together as many elements as I could to make the computer's
operation simple for people to understand and use.
It was capable of answering six questions.
Each question was written on a "punchcard" (I didn't
know what the punched holes were for, but I included them along
with the written question). Each punchcard was marked with a circle
with a number inside it. That
was to symbolize one of the six corresponding round, numbered buttons
on the front of the computer.
The card was deposited into an "Input Slot" at
the top of the computer (whereupon it was unceremoniously dropped
out the back). Then the user punched the correct button, which was connected
to a push-rod. The
push-rod pushed the answer slip out of a compartment inside the
computer, whereupon it would fall out an "Output Slot"
near the bottom of the computer's front.
Voila! My computer!
I was actually very disappointed.
My computer wasn't magical.
It was fake! I had to set it up first!
I had an uncle who was an electrical
engineer and worked for the FAA in Oklahoma City on communication
electronics and radio frequency engineering
for antennas. He happened
to visit shortly after I'd made my computer.
I told him that it wasn't a very good computer, because it
had to be rigged to work.
He then told me that all computers have to be programmed,
and began to explain how computers really worked.
I was fascinated. This was one of the transformative experiences
of my young life.
Another time he showed me how to enter
a 49-line program into his early Hewlett-Packard programmable calculator
in order to play an artillery game.
The game's goal was to zero in on a target, entering an angle
for the artillery round to be fired, and getting back a positive
or negative number indicating having either shot beyond or in front
of the virtual target. You had a limited number of turns before you'd be hit yourself.
I was hooked! I became a programmable calculator geek
all through junior high and high school.
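As a rough illustration of what such an artillery game amounted to, here is a minimal
sketch of its logic in Python. The original was, of course, 49 steps of calculator
keystrokes; the muzzle velocity, target range, turn limit, and hit tolerance below
are my own invented stand-ins, not values from that program.

    import math
    import random

    # Hypothetical reconstruction of the artillery game's logic. A hidden
    # target sits at a random range; each turn the player enters a firing
    # angle and learns how far beyond (+) or short (-) the round landed.
    VELOCITY = 100.0   # assumed muzzle velocity
    G = 9.8            # gravitational acceleration
    target = random.uniform(200.0, 1000.0)

    for turn in range(5):                  # assumed turn limit
        degrees = float(input("Firing angle (degrees): "))
        # Level-ground projectile range: v^2 * sin(2a) / g
        impact = VELOCITY ** 2 * math.sin(2 * math.radians(degrees)) / G
        miss = impact - target
        if abs(miss) < 10.0:               # assumed hit tolerance
            print("Target destroyed!")
            break
        print(round(miss))   # positive = beyond the target, negative = short
    else:
        print("Out of turns - you've been hit!")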
Lesson Learned: Computers and Information Systems are not magic!
My broad-based education as a designer and career goals
upon graduation in 1983
All through high school, I had studied
with the intention of becoming an engineer. I loved lab work and word problems, especially drawing elaborate
illustrations and diagrams to accompany my exercises. But I loathed the monotony of abstract
mathematics. I studied
until I grasped the underlying concepts, but being a visual thinker,
I could only love mathematics and physics in their visual, preferably
real life, forms. After a year of college, I took a 180-degree
turn and went into painting, art history, psychology, and economics,
with a goal of becoming a fine artist.
While studying these domains I came upon a book on The Bauhaus,
the famous German design school of the 1920s and early 1930s. What impressed me most was the curriculum as printed in this
book. Mathematics,
materials, typography, dance, fiber, and on and on.
Such a broad and diverse range of things, all important to
pursuing highly-integrated and holistic design solutions. This was what I was looking for - industrial design.
I discovered that the nearby Kansas
City Art Institute had a very strong School of Design. It had been chaired by the well-known
designer and architect, Victor Papanek, who wrote the influential
books, "Design For The Real World" and "How Things
Don't Work." He
brought a real sense of critical thinking to design, prompting young
designers to ask important questions, such as for whom they
were designing, and what kind of value their designs and solutions
brought to the larger world.
While at the Kansas City Art Institute
I spent a great deal of time using and hacking around on the School
of Design's Apple ][+, which sat in a small room near the professors'
offices. I managed
to use the program GraFORTH to build simple 3D wireframes that I
used to establish dimensional perspectives for product marker renderings.
I graduated in December 1983, and just
one month later a product appeared that would bring all of my design
ambitions and goals into a clear path that I've followed to the
present - the Apple Macintosh.
For nearly a year I hung around the store that sold Apple
computers. I was getting freelance design work here and there, but all I could
think about was how empowered I would be with a Macintosh.
I managed to get a loan, using my car
as collateral, and purchase a 128K Macintosh and an ImageWriter
dot-matrix printer. The
Macintosh represented to me a most amazing combination of physical
device design tightly integrated with an equally revolutionary onscreen
user interface. I was less interested in the software
GUI by itself than in the amazing power unleashed by the GUI combined with
the mouse input device. To
me at the time it was clear that this was the domain that I wanted
to pursue in design - the combination of physical, visual, and informational
interaction. The only trouble was that it was now early 1985,
and I was a long way from anywhere there was work doing user
interfaces.
So again, I decided to design my own
computer. But this
time it would become part of my portfolio as I sought out design
work.
Lesson Learned: Designing across boundaries requires a very broad education
and benefits from generalist experience.
Flat-panel Computer, physical interaction and software
design - 1985
In early 1985 I began developing a
concept for a computer with a positionable flat-panel display. The design encompassed the exploration
of an alternative physical user interface, the form of the computer
itself, and a software system designed to manage a smart office
building.
After an exploration phase with numerous
configuration sketches, the physical interface was essentially the
functionality of the mouse divided into two devices, one for positioning,
and one for action commands, which would be located on either side
of the computer's detached
keyboard. Both plugged into the keyboard and could
be moved from side to side, depending on the user's dominant handedness.
The goal was to speed up the interaction.
In observing users playing video arcade games, I had become
fascinated with the speed and efficiency of their physical interfaces.
I was trying to reproduce that intense, fast, and efficient
physical interaction with this design configuration.
The software portion of the project
was developed using MacPaint, with the screens iteratively developed
and arranged into a sequential storyboarding of the interactional
flow. The software covered a range of features
and activities pertinent to monitoring and managing the various
infrastructural services of an office building, including electricity,
lighting, plumbing, temperature, and elevators, as well as the building's
telephone switchboard and directory.
In observing the receptionists of corporate
office buildings, I thought that speed and efficiency could be increased
by placing all of these functions in a centralized console-style
arrangement. Access
and interaction would be minimized to short, direct navigation to
a function and rapid sequential operation.
A hands-free headset would allow the user to use both hands,
quickly shifting from monitoring, to managing, to providing telephone
receptionist functions.
Lesson Learned: The deepest, most impactful user experience design involves
designing the physical controls, visual interface, and overall interactional
architecture together, as a tightly integrated whole system
Interaction Design and Information Architecture design
consulting work from 1985 - 1989
Throughout the last half of the 1980s
I lived in Dallas, Texas, and worked on a wide range of products
and systems, large and small.
I often worked as a subcontracting co-consultant along with
older and more experienced designers.
Not only did they help me gain experience across a wide range
of devices and systems, but also taught me the skills of the consulting
business.
I did quite a lot of prototyping, including
electronics such as calculators, toys, computer monitors, and medical devices. I also worked on some military projects
through contracts with Texas Instruments, such as an early wearable
GPS device for the Army and a cockpit control trainer for the Navy. But I also worked with others to build
even larger things, such as a full-scale trailer-mounted prototype
of a Gulfstream G-IV business jet.
These experiences helped me develop a deeper understanding
of hardware design and manufacturing.
For industrial design projects such
as a Federal Reserve bill-sorting system, which I worked on for
Recognition Equipment Incorporated, and an automated blood analyzer
that I worked on for Abbott Laboratories, I conducted extensive
design research. I studied the target users' goals and created full-scale
foamcore mockups on which I did link-analysis and ergonomic studies
to examine various configurations and their benefits.
Lesson Learned: The best design education is hands-on experience in all the
various aspects of development and working alongside older, more
experienced and mentoring designers.
Development of the Open Look™ GUI for Sun Microsystems
- 1988 - 1989
In 1988 I teamed up with Norm Cox and
Alan Mandler to develop Sun Microsystems' Open Look™ GUI. Norm was a graphic designer who had been
the original icon designer for the Xerox Star computer, developed
at PARC in the 1970s. I'd
been working with Tom Noonan, an industrial designer who had designed
the mouse for that system.
I was getting to work with some of the pioneers of the field.
At the time it was a big decision to
work on a project that was completely software, but I knew that
opportunities to design anything as fundamental as an entire OS
GUI just didn't come along very often.
It was an enormous learning experience.
My role was to take the basic components and interactions
that Alan and Norm were developing, refine and document them,
and develop all of the layout and behavioral rules.
We were also working on IBM's OS/2
Presentation Manager GUI at this time, which allowed me to meet
a number of interesting people also consulting on that project,
such as Edward Tufte. This was an era when several OS GUIs were
being developed - the NeXT GUI and Hewlett-Packard's OSF/Motif among them.
At the time I thought that these were
all well and good, but they just weren't fundamentally breaking
any new ground. I still
remembered the giant leap that the basic GUI user experience had
been over the former command-line experience.
I wanted to know what lay beyond the current desktop experience.
Lesson Learned: Disappointingly, the majority of design work is not
groundbreaking innovation, but variations on something already successful.
InfoSpace, the whitepaper and information strategy - 1988
- 1993
Again I would develop my own concepts
to explore a wide range of ideas I was having. The result was the InfoSpace project,
which I began in late 1988 and worked on through 1991. I presented the paper and interface concepts
in 1993 at 3CyberConf - The Third International Conference on Cyberspace
- held at the University of Texas, Austin.
My experiences with Open Look and growing
interest in what constitutes revolutionary innovation (i.e., innovation
capable of opening up entire new worlds in which many types
of embodiments could emerge) prompted me to explore several ideas
and approaches in the InfoSpace model:
1) A network browser model.
2) Integration of the browser into the OS GUI itself.
3) Expansion of the 2D desktop into a "space" that supported large
amounts of interactively visualized data and elements.
The model described the use of metadata (what I termed "metainformation")
of both objective (file size, date of creation, owner, etc.) and
subjective (a 3-out-of-5 rating, categorical placement) varieties,
which could be mapped to visual, spatial, and/or behavioral attributes
within the visualization. In subsequent models I extended the metadata
concept beyond metadata that an informational object carried
with itself, or that was produced by the object's creator(s),
to the idea of metadata gathered from distributed
third parties in a "secondary query" (e.g., issue a primary
query, which brings back a number of hits, then issue a secondary
query, which brings back distributed metadata associated with the
initially returned data objects, by specific object or by associated
class).
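A small sketch in Python may make the two-stage query idea concrete. The data
shapes, field names, and metadata sources below are illustrative assumptions on
my part, not the actual InfoSpace specification.

    # Hypothetical sketch of the primary/secondary query pattern.
    index = [
        {"id": 1, "class": "paper", "keywords": {"sonography", "design"},
         "filesize": 42_000, "owner": "kh"},
        {"id": 2, "class": "image", "keywords": {"design"},
         "filesize": 9_000, "owner": "lp"},
    ]
    # Distributed third-party metadata, keyed by object id or object class.
    ratings_db = {1: {"rating": 4}, "image": {"rating": 3}}

    def primary_query(terms):
        """Stage one: return objects whose own metadata matches the query."""
        return [obj for obj in index if terms <= obj["keywords"]]

    def secondary_query(hits, sources):
        """Stage two: for the objects already returned, gather metadata
        from distributed third parties, by specific object or by class."""
        for obj in hits:
            obj["third_party"] = [
                src.get(obj["id"]) or src.get(obj["class"]) or {}
                for src in sources
            ]
        return hits

    results = secondary_query(primary_query({"design"}), [ratings_db])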
One of the primary goals of the InfoSpace
project was to explore and develop data retrieval methods that would
utilize active exploration and interaction, and leverage the powerful
human visual cortex to spot and examine interrelationships in large,
dense data sets. Even
today the predominant approach to search appears to have as its
goal the return of a reduced set of hits.
I term this the "finding the needle in a haystack"
approach. Conversely, InfoSpace was "pro-hay,"
and sought to develop ways to interactively visualize thousands
of returned hits in simple yet powerful ways, with the goal of
teasing out salient interrelationships and spotting candidate
hits scattered amongst the returned group that might not otherwise
be easily reachable through a simple hit list, which could require
tediously wading through hundreds of pages.
My reasoning, which I still find completely
valid and underexplored in the information architecture and user
experience design fields, is that leveraging the human visual cortex
with interactive visualization will be many times more powerful
than using artificial intelligence, agents, or simple schemes such
as "page rank" to pre-digest large data sets and return
a few suggested hits. I suggest that this is particularly true
when it comes to exploring the interrelationships inherent
in large data sets by interactively controlling how different
metadata attributes map to their associated aspects of the
aggregate visualization.
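As one hypothetical illustration of that mapping (the attribute names, visual
channels, and scalings below are invented examples, not widgets from the actual
InfoSpace illustrations):

    # Hypothetical mapping of metadata attributes, objective and subjective
    # alike, onto the visual attributes of a glyph, so that thousands of
    # hits can be scanned as one dense, interactive visualization.
    def visual_encoding(obj):
        return {
            "x": obj["date"],                    # position <- creation date (days)
            "size": obj["filesize"] ** 0.5,      # glyph area <- file size
            "hue": hash(obj["owner"]) % 360,     # color <- owner
            "brightness": obj.get("rating", 0) / 5.0,  # salience <- 3rd-party rating
        }

    glyph = visual_encoding({"date": 12045, "filesize": 42_000,
                             "owner": "kh", "rating": 4})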
In the late 1990s I created a series
of illustrations to demonstrate these ideas and user interface widgets
that would support them. While
such methodologies have not yet become a general part of the
computing experience, let alone been built into the very
desktop/space itself, I feel that eventually the field will come
to realize the inherent power in leveraging interactive human visualization
in addition to the reductionist list methods that are dominant today.
Lesson Learned: Technologies that enhance, exploit, or leverage existing and
powerful human capabilities are often overlooked in favor of those
that promise that machines will do the "heavy lifting"
and provide us with an "answer."
Device, software, and system design projects from the
early 1990s
In 1989 I moved from Dallas, Texas
to Palo Alto, California in order to join the Silicon Valley community
where I felt I would have the greatest opportunities to design
devices, software, and integrated combinations of both.
This turned out to be a very good strategy, as I quickly
discovered many different types of projects that fit the goals of
my design consultancy.
The Acuson Sequoia Ultrasound System, physical control
design strategy and display
interface - 1990
One of the first Bay Area design groups
I met and developed a co-consulting relationship with was Lunar
Design in Palo Alto. Lunar
was just beginning a large-scale industrial design project with
Acuson Computed Sonography in Mountain View, California to design
their next generation Sequoia Ultrasound System.
I had shown up at a very fortuitous
moment, as they were just being confronted with the design problem
of developing the new ultrasound system's user interface. Like many projects then, and even today,
the user interface was seen as two separate parts - the physical
controls and the visual software display interface. The physical controls interface fell into the hardware/industrial
design category, so it was in this area that I began my work, conscious
that I would have to discover a way to design both in an integrated
manner if the whole user experience were to be successful to the
degree I felt was achievable.
After observational studies of users
and their operation of previous generation ultrasound systems, I
noted that most of the functional modality, rather than being integrated
together and accessible in an organized and consistent manner, was
implemented in a manner that ensured it would be difficult to learn
and access. Many separate modal functions were accessible
only by second- and third-level functions via the system's keyboard
keys. Once invoked,
their use was often the crudest implementation of the raw underlying
program operations, as opposed to approaches that were designed
to be the easiest for an operator to learn and use efficiently.
Ironically, in interviews with experienced users, I often
found an attitude that defended this difficult-to-learn system. After all, they had already scaled that
nearly vertical learning curve, and so saw any attempt to simplify
its usage as an impediment to their learned skills and power-user
speed. I was determined to serve both these needs with a good design
solution.
Link analysis studies revealed a great
deal of physical movement between widely-placed controls, and operational
sequences that required far too many tedious steps. Through iterative design and modeling, this led to a concept
I dubbed "centralized interaction," which meant that the
physical controls would be brought together in a centralized place,
and create a user experience that was more of a "driving experience"
than one where every operation had its own idiosyncratic interaction
strategy.
Since this was to be a long project
(it took over six years), and both Lunar and I wanted to demonstrate
some of the interaction concepts that were being developed, we jointly
created a conceptual project dubbed "Modus Operandi,"
which consisted of a well-designed centralized interaction device
along with an associated, and more user-friendly software approach.
Instead of sonography, I chose the example of a PET scanner,
as I wanted to stay within the realm of medical visualization.
Concurrent with the Modus Operandi
project, Jeff Smith, one of Lunar's co-founders, and I co-wrote
an article for the April 1991 issue of Medical Design & Materials
entitled "Applying The Interaction Design Approach to Medical
Devices." It presented the basic concepts
and methodologies that we were developing in order to tightly integrate
industrial design with the overall user experience, involving hardware,
software, and information.
The centralized interaction model was
adopted for the Sequoia Ultrasound System, and years later in 1999
when the Sequoia system was awarded a Silver in Business Week/IDSA's
Designs of the Decade competition, one of the jurors, Dr. Lorraine
Justice, IDSA, of the Georgia Institute of Technology, made the
following statement: "The well-designed product interface
was a major contributor to the success and usability of this product.
It has helped to set the design standard for medical products
for the latter half of the decade."
Lesson Learned: It is worth taking measured design risks and pushing beyond
normal development practices in order to achieve new levels of integration
and usability.
Syntex Laboratories Gestural slate-device virtual office
software for pharmaceutical
salesforce - 1991
In 1991 I was hired by Syntex Laboratories
to design a gestural pad-style computer interface to support the
various needs of their large pharmaceutical sales force. The lead consultant had been Alan Mandler,
with whom I'd worked on Sun Microsystems' Open Look GUI a few years
earlier. Mandler had
proposed a unique, and yet incredibly easy to learn and use,
"virtual office" design.
Rather than foist a complex command-line or early Windows
software application on these users, who ranged from recent
college graduates to older salespeople with limited computer experience,
we set out to create a simple virtual office capable of handling
their tasks in ways more familiar and accessible.
In addition to design research, where
we discovered the range of computer skills and learned about the
functions that needed to be addressed, I began iteratively diagramming
the system's interactional architecture and mapping functions to
elements in the virtual office environment.
This occurred during the period when gestural slate computing
was emerging as a possible alternative to the desktop GUI model. It was also prior to Microsoft's introduction of PenWindows.
Many believed that PenWindows was introduced primarily to drive
other slate-style efforts out of business, as they represented a
threat to Microsoft's growing hegemony in the business computing
world. Whether this
was true or not, after the demise of companies such as Slate and
Go, PenWindows did not go on to become a major system.
The approach we were taking with the
Syntex salesforce project was truly unique, however. It might have become a great success, had it made it to the finish
line, but another business reality intervened and changed everything.
The patent Syntex had on their anti-inflammatory drug, Naprosyn
(naproxen), was due to expire.
This drug brought in over a billion dollars a year, but upon
expiration of the patents, and introduction of over-the-counter
versions (Aleve™), Syntex was rushing to get past the remaining
FDA hurdles to launch a new drug.
Just before their patent expired, their new drug failed to
achieve FDA approval, and Syntex was sold to Roche Biosciences. The project was canceled.
Lesson Learned: Information architectures can be embodied in a wide range of
interactional architectures, including virtual representations of
real world environments.
Lesson Learned: Sometimes, even promising design efforts can be done in by external
business forces
Axcis Pocket Information Systems Trackmaster™ -
1992 - 1993
Trackmaster™ was horseracing betting
and statistical management software, implemented on an early palmtop
device, the Hewlett-Packard HP65.
It was connectable, via dialup modem, to a horseracing database
and information network maintained by Axcis, then a startup company.
The challenges were formidable. The functionality was deep, the visual
and physical interface small and highly constrained. Added to this was a user demographic with an average age (in
1992) of 56. Granted,
this was a highly motivated demographic.
Statistical information, and the means to easily and portably
access it, provided an edge to bettors, and so they were willing
to go to great lengths to learn, if it meant an advantage at the
track.
This didn't mean that the interface
couldn't be improved, however.
When this client first called me, like many others before
and since, the conversation began with the familiar, "Everything's
finished. We only need the user interface."
This often indicates a doomed project, though after examining
it I was confident that good design would convince them that this could
not be a superficial fix atop their current system.
What they envisioned as a quick two
or three week project turned out to require about seven months of
intense work. Like
many interfaces on devices of this era, the interface graphics had
to be broken up to be implemented as character cells.
As such, it was extremely tedious to create the graphical
aspects of the interface, but the majority of the program was implemented
as navigable spreadsheets and lists.
This device was, at the time, the smallest
and most constrained environment in which I needed to provide both
a general interactional architecture as well as an information architecture
where a wide range of dynamically-changing data was easily accessible
and usefully interactive.
The Trackmaster device, which could've
easily failed from sheer unusability, proved intuitive and powerful
and helped to establish Axcis as a leader in the field of horseracing
handicapping databases.
Lesson Learned: Significant hardware and software constraints are not insurmountable
barriers to successful usability
Apple Computer Home Computers of the Future for Disneyworld's
Epcot Center
Innoventions exhibit - 1994
In early 1994 I was approached by Apple's
Director of Design to conceive and develop two conceptual OS GUIs
for home computers Apple was contributing to the Magic House, part
of the larger Innoventions exhibit at Disneyworld's Epcot Center. Both were flat-panel displays, one hanging beneath kitchen
cabinets and the other a desktop model representing a home office
computer. The physical
computer models were designed and prototyped by Lunar Design.
The first version was a canned demo
that synchronized with a Disney character that moved between video
monitors stationed along the tour path.
Later on, my team and I turned the interfaces into interactive,
touch-screen enabled demos that visitors could actually operate
themselves.
Working with a graphic production designer,
I designed two embodiments of essentially the same OS GUI model. These demonstrated several applications
apiece. The kitchen-based
computer featured applications such as email, video post-it notes,
home energy monitoring and control, an interactive cookbook, online
grocery shopping, an Internet browser and concepts for weather and
traffic conditions, maps, and driving directions.
The home office computer featured some of the same applications,
such as email and Internet browsing, but also financial management
and desktop publishing.
Though the web was still in its infancy
at the time, the applications conceived and embodied in
these two prototypes were fairly prescient, and accurately predicted
the kinds of services that eventually emerged, both on the web and
as personal applications.
The single most notable feature these two
models portrayed that eventually emerged almost identically, however,
was what we now know in OS X as the Dock.
In the two models, there were large icons parked at the bottom
of the screen, representing applications that the user most frequently
accessed. When invoked, they would "morph"
open, just as the modern OS X Dock icons do today.
While this was a complex project, it
did not have a large budget, nor the amount of time necessary to
conduct extensive research beforehand.
This was a project that required successfully extrapolating from past knowledge and familiarity with where
then-current trends and technologies would lead, and then creating
a bold conceptual vision.
It turns out that my concepts were very accurate.
Lesson Learned: It is very definitely possible to perceive and extrapolate
user needs and develop successful interfaces without extensive user
research, if one is adept at understanding generalized patterns.
StarSight Telecast Electronic Programming Guide System - 1994 - 1997
The StarSight Telecast project was
an important turning point in my design consulting career. It marked the first opportunity I had
to expand my design consultation to the development of strategic
intellectual property, covering a television electronic programming
guide GUI, a remote control strategy, and the data model that enabled
much of the interactive functionality.
On previous projects I had sometimes
been asked to review patent applications, and this made me realize
that much patent work was done after the design was complete,
and often missed strategic opportunities.
Through exposure to patent documents I taught myself enough about
how claims were constructed to begin offering this as an addition
to my design work. I began telling prospective clients that I would not only design,
but design in such a manner as to maximize the product or system's
strategic positioning in the marketplace.
This was a rather audacious claim on my part, but I felt
that such an approach was not only possible but potentially more
valuable as a consulting service, if I could do it successfully
and then use that success to win more of the same work.
The project stretched over two and
a half years, with major iterations being developed on roughly six-month
cycles. The industry
buzz at this time was centered around concepts such as virtual shopping
buzz at this time was centered around concepts such as virtual shopping
malls that a user was supposedly going to be navigating around in,
or even more unlikely scenarios, such as being able to click on
the clothing of an actor and then being able to order it.
I looked at such ideas as intriguing,
but highly dubious and missing the opportunities that lay in exploiting
some of the features already in contemporary onscreen guides. Rather than try to design past the grid-based
guide, I embraced it as an elegant and compact manner of accessing
program information. My
strategy began by attacking the remote control, which was a button-laden
monster that had many of the same drawbacks of the medical equipment
interfaces that I'd encountered previously.
My goal was to create a "steering wheel for the thumb"
as opposed to a button for every individual function.
This would then work seamlessly and efficiently with an interactional
architecture developed for the program guide's GUI.
The first thing I did was constrain
navigational movement primarily to up-and-down vertical scrolling. The result was a remote control that had
a clickable thumb-roller, surrounded by directional stepper buttons
for moving around areas on the screen.
This eliminated the majority of unnecessarily tedious interactions,
such as the act of moving a free cursor around a screen and trying
to target objects. This
also fit into a viewpoint I had regarding information in general,
which was that it could always be reduced down into scrollable lists,
which could be interacted with much more quickly and efficiently.
I wanted a system that was much faster to interact with,
and based on much simpler and fewer types of interaction.
Next, instead of abandoning the grid
guide, I dove into it with a concept we later patented, which
I called "Contextual Linking."
Put simply, this meant that as one moved about on the grid
guide, or up and down a compiled list, from program cell to program
cell, clicking would bring up another list that contained both actions
that could be performed (Tune to This Program, Record This Program,
Remind Me) and listings of links, purchasable items, or as many
other associated things as could be linked to that program or
its parent channel.
The Contextual Linking concept then
required a data model behind the system that allowed for this linking
to be added to the raw flat files that constituted the basic channel
guide information. StarSight
Telecast had been simply buying this raw information and distributing
it to their set-top boxes, with no additional information being added.
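To make the concept concrete, here is a minimal sketch of how such a contextual
list might be assembled. The function, field names, and sample links are
hypothetical illustrations, not the patented implementation.

    # Hypothetical sketch of Contextual Linking: the raw flat-file guide
    # data is augmented with items linkable by program or by channel.
    program_links = {"prog-123": ["Cast Biographies", "Order Soundtrack"]}
    channel_links = {"HBO": ["Channel Schedule", "Subscribe"]}

    def contextual_list(cell):
        """Build the list shown when a grid cell is clicked: universal
        actions first, then whatever is linked to that program or to
        its parent channel."""
        items = ["Tune to This Program", "Record This Program", "Remind Me"]
        items += program_links.get(cell["program_id"], [])
        items += channel_links.get(cell["channel"], [])
        return items

    print(contextual_list({"program_id": "prog-123", "channel": "HBO"}))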
A series of prototypes were built from
this project's iterative cycles, and numerous patents were filed. In 1997 the company was acquired by Gemstar,
and the project development was halted. The intellectual property was folded into Gemstar's immense
patent chest, and to this day remains unbuilt, which is unfortunate. The goal had been to bring an entirely
new level of power and interaction to the television-based environment.
Lesson Learned: The most powerful design solutions do not always involve abandoning
current methods and embodiments.
Lesson Learned: While strategic design and intellectual property development
is a smart approach to development, it can sometimes lead to good
ideas being stored away unimplemented in patent warchests.
Nike Triax™ Series of Runners' Watches, development work
on the physical and visual
interface - 1996
In 1996 I was asked by my friends at
Astro Product Design in Palo Alto to consult on the development
of the interactional aspects of the line of runners' watches they
were designing for Nike. I
was excited about this project, as I could see that the watches
were going to be incredibly cool and the obvious visibility of the
project would be an excellent showcase for good user experience
integrated with good industrial design.
Over the course of two or three months,
my associate, Linda Pomerantz, and I analyzed the required functionality,
and developed a screen layout and general pattern for interaction. As simple as the resulting system was,
the process by which we iteratively developed the interactional
patterns was very hard work.
As with many products, the interface's
physical buttons had been established prior to the visual interface,
and so it was a constraint to map the necessary functional interactions
to the existing number of buttons. Luckily, these turned out to be a good configuration, and it
was possible to order the functional modes and their associated
interactions in consistent patterns and sequences.
In the end, the product was extremely
successful. It was
named by Business Week and IDSA as Consumer Product of the Decade
in 1999. At the time I was very disappointed, as
the industrial design and designers were celebrated widely, but
the effort that went into the development of the user experience
was not recognized or credited.
As I struggled to get exposure for interaction design beyond
the familiar media, web, and software interfaces, this felt like
a terrible missed opportunity for exposure to these ideas and their
value in whole product design.
Lesson Learned: Good interaction design is a discoverable quality, unlike product
aesthetics and styling, which are instantly visually evident.
Lesson Learned: Good interaction design is not nearly as noticeable as bad interaction
design or the lack thereof altogether.
Coherent UltraPulse Encore Surgical Laser Touchscreen
Interface - 1998
This project entailed the research
and development of a touchscreen user interface for a surgical laser
system. Yet again,
this was a project where a significant effort had already taken
place on the system's industrial design, and the user interface
was considered a separate effort to be done afterward.
In this case, however, the interface was entirely restricted
to a touchscreen display, and so didn't involve mapping functions
to a set of physical controls that had been established without
regard to the whole user experience strategy.
This project was also an example of
how companies can understand the importance of industrial design,
while not recognizing the inherently interrelated importance of
a product's user interface.
This was a project that I had to sell pretty hard to get
the chance to do. The client was more than willing to release the product with
a very crude and poorly thought-out interface, which could very
well have seriously hurt the product's usability, performance,
and overall success.
The first step, after extensive analysis
of the product's functionality and general operation, was to observe
surgeons using lasers in the operating room. This device was used primarily in cosmetic surgery, and had
two primary modes of operation:
1) Continuous Wave, a focused beam used to cut
like a scalpel, and 2) UltraPulse, which delivered the laser in
pulses configurable as different patterns with variable sizes
and densities. UltraPulse was used to ablate the surface of the skin. A common procedure, and one my associate,
David Tu, and I observed, was cosmetic surgery around the eyes. The Continuous Wave mode was used to cut
into the inside of the lower eyelid to remove the fat globule beneath
the eye. And the UltraPulse
mode was then used to tighten up the skin and smooth out the wrinkles
around the eyes. Different
patterns were chosen to, essentially, anti-alias around the eyes. This was a fascinating procedure to observe.
We also noted in detail the types of
interaction between the surgeon and assistants before and during
the procedure. This
led to some important insights about how the device was reconfigured
throughout the procedure and how the settings were viewable by the
surgeon and assistants.
The design strategy centered around
discovering the important information about the device settings,
and how these could most easily be seen and understood.
There were also important interrelationships between certain
settings, such as Power (in joules) and Pulse Rate (in hertz, or
pulses per second). These two settings, though independently
configurable, were also interrelated in that at higher powers, pulse
rate was limited, and vice versa.
In earlier models, this relationship was not visually apparent.
In our final design, these two settings
were embodied as two segmented meters, with bright segments indicating
the level currently set, and secondary, darker segments indicating
what was possible given the other related setting's current value. So, for example, as the Power was raised,
the display for Pulse Rate would show the darker segments decreasing.
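A minimal sketch of this interrelated-meter behavior, with an invented limit
curve and invented full-scale values standing in for the real device's
specifications:

    # Hypothetical sketch: as Power rises, the attainable Pulse Rate falls,
    # and the meter darkens the segments that are no longer reachable.
    MAX_POWER_J = 60.0    # assumed full-scale power, in joules
    MAX_RATE_HZ = 10.0    # assumed full-scale pulse rate, in hertz
    SEGMENTS = 10

    def rate_ceiling(power_j):
        """Invented limit curve: higher power leaves less rate headroom."""
        return MAX_RATE_HZ * (1.0 - power_j / MAX_POWER_J)

    def rate_meter(power_j, rate_hz):
        lit = round(SEGMENTS * rate_hz / MAX_RATE_HZ)
        reachable = round(SEGMENTS * rate_ceiling(power_j) / MAX_RATE_HZ)
        # '#' = current setting, '-' = still available, '.' = locked out
        return "#" * lit + "-" * (reachable - lit) + "." * (SEGMENTS - reachable)

    print(rate_meter(power_j=30.0, rate_hz=3.0))   # -> ###--.....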
The other design insight was that the
surgeon and assistants understood the device's state as a series
of numbers representing the multiple settings for power, pulse rate,
pattern, size, density, etc.
This led to the development of the QuickLook design strategy. This meant that the GUI was designed so that these values were
aligned in an easy-to-see vertical arrangement, and contrasted with
the rest of the interface.
The two primary operational modes were selectable via a simple
tab method at the top of the screen, and the entire interface was
designed to reflect the styling found in the industrial design.
This led to an interface design that was both extremely usable
and also integrated well aesthetically.
The Medical Group of Coherent was later
spun off and sold as a separate company, becoming Lumenis. The UltraPulse Encore (its final product
name) went on to be a very successful surgical laser, and was commended
for its excellent usability by surgeons who used it.
Lesson Learned: It can be worthwhile to push a potential client to undertake
a design program they didn't previously think necessary.
Lesson Learned: It's valuable to reinforce informational and control interrelationships
visually and interactionally beyond simple functional requirements.
Kensington VideoCAMworks computer videocamera software -
1998 - 1999
In late October 1998 I was approached
by Kensington to design the companion desktop software for a computer
videocamera being designed by my friends at Astro Product Design. This time, the device itself had no substantial
user interface. But
there was a good deal of functionality that needed to be embodied
in the software, both operationally, and in terms of data management.
Contemporary consumer-level webcam
software at the time was generally limited to basic monitoring and
frame/video capturing, and embodied in basic OS GUI widgets. The Kensington VideoCAMworks needed to perform these functions,
but also to include such functionality as simple videoconferencing. In the course of my design work, I also
added two additional functions which were quite fun to play with
- the ability to create time-lapsed videos and stop-motion video
animations.
The software was required to be simultaneously
released on both Windows and Macintosh platforms, and so rather
than design and build the software out of standard GUI components,
I opted to create a much richer and more "console-based"
application.
A console-based application is one
that is configured primarily as a single screen with interactive
panels and/or other elements.
The design of VideoCAMworks evolved to include a background
and several fixed elements, including the camera display panel,
a "tray" where newly created image or video icons would
appear for subsequent review or management, and two slide-out-and-back
panels that represented a standard file tree explorer and a directory
view. Across the top
of the console was a group of icons, which acted as drop targets
for File, Delete, Open (in addition to double-clicking on the icon
in the Tray), Print, Fax, Email, and Create New Collection Directory.
The application was designed to be
intuitive and learnable by simple exploration and interaction. Almost all the functions were live, with
rollover state changes, and simple toggling behaviors. Never was there a possibility that the
user would be suddenly whisked off to an unfamiliar location or
window. Repeating a click on a modal button would
often toggle back to the beginning state or mode, with accompanying
visual context. I dubbed
this approach, "Play To Learn."
Being someone who's rarely opened an instruction manual of
any type, I feel many interfaces, and certainly those associated
with consumer electronics, environments, software, and systems,
should be learnable by a natural exploration process, instead of
requiring rote learning of complicated instructions.
Several complementary strategies go
into imbuing an interface with a Play To Learn quality, and these
are the lessons that were learned and put into practice in this
product:
Lesson Learned: Break the functions down into as few major divisions as possible
and symbolize these divisions in an instantly recognizable visual
and spatial manner. This
doesn't mean that the user must instantly understand these divisions,
but they must be able to look at the system, or play with it for
a very short time and be able to intuit that there are two or three
or four major elements or divisions existing in the product.
In the VideoCAMworks application, the Directory Panels on
the left, Main Video Display feature on the right, Tray at the bottom,
and Drop Target Icons along the top constituted this division.
It was instantly visible to a user, and hence easier to approach
as separate functions, making it easier to understand the functional
interrelationships within the application.
Lesson Learned: Don't make users spelunk. Spelunking is the term given to cave exploration. Now, many people enjoy strapping on a headlamp
and shimmying down into muddy and dark holes and exploring cave
systems. But few people
receive the same enjoyment from a similar activity in a new and
unfamiliar piece of software or device.
Spelunking occurs when navigation is required within a complex
and hidden series of windows, modes, and hierarchies.
When in one specific place, there is no way to envision the
rest of the functional map, or the interrelationships it might otherwise
reveal. So the user is left to grope about blindly, ever so slowly
building up their own mental map of the interactional cave system. Users usually don't emerge from
spelunking such systems muddy, but rather frustrated and disappointed.
The key to avoiding spelunking is
to either create a console-based system where very little navigation
is required, or provide some contextual overview as to where a person
is, how it's related to the rest of the system, and make as many
functional interactions as consistent and similar in pattern as
possible. In the VideoCAMworks application, I chose to make it a console-style
application, as I felt that better suited the "playing around"
factor. Everything
the user needed was "right there." Most functional alternatives were within one or two button
clicks of visible elements.
Lesson Learned: Don't allow users to be run off a cliff. An interface that allows a user to get
fouled up and lost is one that's poorly designed. Often a user will activate a button and suddenly find themselves
in another place. This
is a common hazard in spelunking.
They don't fully understand where they are, and may not even
fully understand how they got there or how to go back. In the very worst interfaces, this may mean committing an irreversible
action. In many, it
simply means the user must trudge back to the beginning and start
again, now having learned not to do that *the hard way.* It is by such grueling interaction that many interfaces become
understood.
Plunging off cliffs can be avoided
by making many actions of a toggling nature. Press a button and something flips to the left and displays
certain functions. Press
it again and it flops back the other way, showing the original functions. Flip-Flop-Flip-Flop - "Oh, okay,
I get how *this* works." If it's a loop of modes or access
to functions, then providing a secondary indicator of where the
user is in the whole loop greatly helps establish the necessary
context for understanding the whole pattern.
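Here is a minimal sketch of such a toggling mode loop with a secondary position
indicator; the mode names and indicator glyphs are invented for illustration.

    # Hypothetical sketch: repeated presses cycle through a small loop of
    # modes, and a secondary indicator shows where in the loop the user is.
    MODES = ["Monitor", "Capture", "Review"]

    def press(state):
        state = (state + 1) % len(MODES)   # a loop, never a dead end
        dots = "".join("o" if i == state else "." for i in range(len(MODES)))
        print(f"{MODES[state]:8} [{dots}]")   # e.g. "Capture  [.o.]"
        return state

    state = 0
    for _ in range(4):   # press the same button four times
        state = press(state)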
As a self-taught interaction designer, I've always regarded such
concepts as just plain common sense,
though they're absent from nearly all interfaces of any significant
complexity.
Polaroid Digital Camera Concept Design - 1999
In 1999 I took on Polaroid as a client,
working with their internal product design group to develop a wireless,
dockable digital camera. My
role was to develop the user experience, which in this project would
involve an integration of both the physical and the visual interfaces
together. This opportunity came about as a result
of my earlier work with David Laituri,
an industrial designer who'd formerly been with Lunar Design
(Laituri was the Industrial Designer on the Apple/Disney Epcot Computers
in 1994). He was now Director of Design at Polaroid,
and as such had the experience and foresight to take the design
of this camera to a new level - one that included approaching the
industrial design and interaction design as an integrated whole
from the beginning.
Polaroid's design group had already
completed an extensive amount of design research and had developed
marketing requirements and a generalized conceptual embodiment for
the device, its associated peripherals, and enabling technologies.
I started with the information and concepts that they had
developed, and began applying a combination of physical and visual
interactional strategies that I'd developed in the course of many
earlier projects.
The first and foremost strategy I employed
was to create an embodiment of a "Drivable Interface,"
which is one that can be "driven" or operated without
needing to look at the physical controls.
Drivable Interfaces are those, such as the StarSight Telecast
remote control strategy, and the earlier Modus Operandi centralized
interaction device, which allow the hands to rest upon a limited
number of physical controls, and thereby free the eyes and cognitive
attention to remain focused on the visual interface and the functional
goals. Drivable Interfaces facilitate a more transparent and fluid
user experience. In
early 1990, when working on the Acuson Sequoia project, I had appropriated
the German term that Volkswagen had been using in their advertising
campaigns, "FahrvergnŸgen," which meant "Driving
Experience," to describe the qualities of the experience I
was trying to create.
Since the mid-1990s, I'd been developing
a number of concepts around the use of Drivable Interfaces, and
one of the primary advantages I saw was in environments such as
the automobile, where the driver needed to keep attention focused
on the road ahead, and in small devices where external displays
could be replaced by optically-enlarged smaller internal displays.
If a device can utilize a small internal display, and enlarge
it by an eyepiece lens, then this necessarily requires a Drivable
Interface, as the user will not be able to pull away in order to
locate one of several physical buttons. In devices such as telescopes, this could
have the advantage of avoiding undue light pollution from an external
display. In devices
used in bright sunlight, it also avoids the problem of glare,
which makes many external displays ineffective.
The physical controls consisted entirely
of a thumb-operated clickable roller wheel and an index-finger-operated
trigger button.
The second interaction strategy utilized
in this project was the use of a ubiquitous Symbolic Interaction Guidance
Language, or symbols used throughout the GUI to show how the few
physical controls were mapped to functionality at any given point
during operation.
The GUI displayed small, simple symbols
of the roller wheel: a rolling arrow indicating downward or
rightward scrolling, a rolling arrow indicating leftward or upward
scrolling, and an inward-pressing arrow symbolizing the roller being
clicked.
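A minimal sketch of the idea follows; the screen states, actions, and symbol
glyphs are invented stand-ins for the actual symbol set.

    # Hypothetical sketch of a Symbolic Interaction Guidance Language:
    # each screen state declares what the roller and trigger do there,
    # and the GUI renders the matching symbols beside those labels.
    GUIDANCE = {
        "capture": {"roll": "zoom", "click": "toggle flash",
                    "trigger": "shoot"},
        "review": {"roll": "next/previous photo", "click": "show detail",
                   "trigger": "delete"},
    }

    def render_hints(screen):
        symbols = {"roll": "(v^)", "click": "(>o<)", "trigger": "[T]"}
        for control, action in GUIDANCE[screen].items():
            print(f"{symbols[control]} {action}")

    render_hints("capture")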
One aspect of the user experience that
I felt was particularly elegant, was how the camera head could be
twisted around 180 degrees, allowing it to be held either vertically
or horizontally, and the internally-displayed interface would automatically
switch back and forth between portrait (vertical) and landscape
(horizontal) configurations, while operating in exactly the same
manner.
Patents were filed on the basic user
experience model; however, Polaroid's business misfortunes doomed
the otherwise promising project.
Again, potentially successful design strategies were done
in by extenuating circumstances.
Lesson Learned: A smart, holistic user interface strategy can solve problems
beyond the immediate user experience.
Avica Technology MotionStore™ and FilmStore™ Software,
digital commercial cinema
software - 2000 - 2001
This project involved the design and
development of movie theater media distribution, playlist management,
projection, and electromechanical lighting and curtain control software. Avica Technology Corporation had developed
digital film playback technology that replaced the traditional film
reels historically used to record and play back movies.
The interface I was developing was
purely software-based, utilizing a touch screen, and needed to be
operable by a wide range of theater employees with little or no
training. This presented an excellent opportunity
for a Console-style user experience.
Iterative design exploration
and development over several months led to an interface that operated
like a virtual control panel, with two primary operational modes:
Manager Mode and a greatly simplified Operator Mode. In Manager Mode, the console-style interface also contained
some more familiar mouse-and-cursor operable elements
for the building and editing of playlists, though these elements
also supported touchscreen operation.
In Operator Mode, the interface was limited to simple visualization
of the playlist playback progress, and simple Start, Stop, and Resume
functions.
Actual operation, such as playback
and monitoring, was very straightforward, simple, and visual. The edited playlist appeared as a large
horizontally-configured thermometer-style gauge, which showed the
playlist elements and electromechanical operations visually along
its length. As the projection played and electromechanical
events were triggered, a progress indicator moved from left to right
along this virtual gauge.
There are parallels between the Avica
software and the Kensington VideoCAMworks software, in that both
were developed to work in a console style, rather than use standard
OS GUI widgets. In
the case of the Kensington VideoCAMworks software, the aim was to
stay away from a computer-centric model, and invite play.
In the Avica software, the reasons were more environmental. Users were not generally sitting at a
workstation, but were often nearby monitoring the operation from
several feet away. When
there was need for starting, stopping, or other intervention, the
controls needed to be clearly visible and accessible quickly and
simply.
In Manager Mode there were more
information-centric tasks, such as navigating a standard file explorer
tree to find files of movies, trailers, no smoking advertisements,
and pre-movie slideshows or features.
These were then dragged directly onto the playlist gauge,
where they would be visually represented as a segment proportionately
indicating their relative length in time.
Similarly, markers representing electromechanical events,
such as the opening of curtains, dimming of lights, etc., could
be dragged from the directories where they were listed and placed at
the desired spots in the playlist.
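A minimal sketch of the gauge logic; the playlist contents, durations, event
times, and character-based rendering are invented for illustration.

    # Hypothetical sketch of the thermometer-style playlist gauge: each
    # element occupies width proportional to its running time, and a
    # cursor marks playback progress from left to right.
    playlist = [("No Smoking", 1), ("Trailers", 8), ("Feature", 110)]  # minutes
    events = {0.0: "dim lights", 9.0: "open curtains"}  # marker positions (minutes)

    def draw_gauge(elapsed_min, width=60):
        total = sum(minutes for _, minutes in playlist)
        bar = "".join(
            name[0] * max(1, round(width * minutes / total))
            for name, minutes in playlist
        )[:width]
        cursor = min(width - 1, int(width * elapsed_min / total))
        return bar[:cursor] + "|" + bar[cursor + 1:]

    print(draw_gauge(elapsed_min=15))   # '|' marks progress within the bar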
In the end this was a very powerful
system that was incredibly easy to manage and use. Today, Avica Technology Corporation is the leading provider
of technologies and services for Digital Cinema post-production,
distribution and exhibition.
Lesson Learned: It's possible to successfully integrate desktop-style and console-style
GUI elements in a system-control interface.
The Natus ALGO®3i Newborn Hearing Screener - 2002 - 2003
The Natus ALGO®3i was a milestone project
in my design career. It represented the first complex product on
which I directed all aspects of design, from early concepts through
final manufacturing, including the industrial design, physical
ergonomics, control button language, display interface, information
architecture, and infrastructural strategy. This was a product in which I had the
opportunity to bring all of the many aspects key to a successful
product and user experience together in a well-integrated whole
design. I consider this product and project to
be the most significant of my design career so far.
In 2004, the Natus ALGO®3i Newborn
Hearing Screener won both an MDEA2004 (Medical Design Excellence
Award) and a Bronze IDEA (Industrial Design Excellence Award - co-sponsored
by Business Week Magazine and the Industrial Designers Society of
America).
The ALGO®3i Newborn Hearing Screener
is a small, portable, hand-held device used to screen the hearing
of infants between the ages of 34 weeks corrected gestational age
and six months.
Hearing loss is one of the most common
problems found in newborns, affecting approximately 12,000 babies
born in the United States each year. Early detection of hearing
impairment is important because studies show that children with
hearing loss detected at birth can, with appropriate help, learn
and progress at a rate comparable to children with normal hearing.
The ALGO®3i screener can be used to determine the status of a baby's
hearing within hours of birth.
The ALGO®3i uses proprietary Natus
AABR technology. The screener sends soft clicking sounds to the
baby's ears through Natus' specially designed earphones. Sensors
placed on the baby's skin pick up the responses to the sound from
the baby's brain and send them back to the ALGO®3i screener, where
they are analyzed. The screener then automatically gives a "pass"
or "refer" result.
Weighing in at 2 lbs, the ALGO®3i screener's
compact size makes it perfect for screening in a variety of settings.
The ALGO®3i was to be a greatly miniaturized, portable, handheld
improvement to the existing ALGO®3, which was based on a laptop
permanently mounted to a large rolling cart.
The portability of the new design would allow the ALGO®3i
to meet the needs of a much broader and more diverse user demographic.
The battery-operated ALGO®3i was designed to be used in a
newborn nursery, a doctor's office, or at home. For easier transportation,
a custom-made backpack/carrying case was designed.
Not only would the device function in portable or stand-mounted
configurations within hospital and clinical settings, but it could also
be carried by nurses, technicians, and midwives directly to the
homes of newborns, thus expanding the number of potential screenings.
First and foremost among the design
goals for the ALGO¨3i was a successful user experience. The wide range of users anticipated worldwide
for this device demanded an integrated software and hardware interface
that was universal where possible, localized for language where
necessary, extremely simple to learn, and highly efficient to use. To this end, the human interface and industrial
design efforts were tightly coordinated and concurrently developed
in an iterative conceive-model-test-refine process.
The ALGO®3i screener has an intuitive,
simple, and consistent GUI designed to work integrally with the
device's six physical button controls, allowing easy navigation
and data entry. The device's color graphical displays use primary
color-coded shapes bearing simple symbols: a green "OK," a red "X,"
blue up/down scrolling chevrons, and green character toggle arrows,
matched with the same symbols printed on color-coded device
buttons. Together this
simple and universal interactional language easily and flexibly
guides the user through the screening process as well as review
and export of patient records.
In addition to providing an interactive environment that's
both extremely simple to learn and efficient to use, the ALGO®3i's
interactional architecture allows for broad and seamlessly consistent
extension of functionality into the future.
All interactive functionality is symbolically
mapped in the visual display to a few archetypal interactive functions
associated with the following physical buttons (a minimal illustrative
sketch follows the list):
1) Blue Up and Down Chevron Buttons
- Item scrolling
2) Red "X" Button - Return,
Pause, Cancel, Go Back, etc.
3) Green "OK" Button - Accept,
Open, Forward, etc.
4) Black with Green Arrows 4-Way Directional
Toggle Control - Character position selection (left/right arrows)
and Alphanumeric entry (up/down arrows)
5) Aqua "?" Button - Contextually-sensitive
Help
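The following Python sketch illustrates what such a mapping implies for the software architecture. It is my own hypothetical reconstruction for illustration only, not the actual device software, which was written from the ground up for a DSP, as described later; the 4-way toggle is omitted for brevity.

    from enum import Enum, auto

    class Button(Enum):
        UP = auto()      # blue chevron
        DOWN = auto()    # blue chevron
        CANCEL = auto()  # red "X": return, pause, cancel, go back
        OK = auto()      # green "OK": accept, open, forward
        HELP = auto()    # aqua "?": contextually-sensitive help

    class Screen:
        # Base class: every screen answers the same archetypal events,
        # so new functionality never adds new interactional rules.
        def handle(self, button: Button) -> None:
            dispatch = {
                Button.UP: self.scroll_up,
                Button.DOWN: self.scroll_down,
                Button.CANCEL: self.go_back,
                Button.OK: self.accept,
                Button.HELP: self.open_help,
            }
            action = dispatch.get(button)
            if action:
                action()

        # Each concrete screen overrides only the responses it needs.
        def scroll_up(self) -> None: pass
        def scroll_down(self) -> None: pass
        def go_back(self) -> None: pass
        def accept(self) -> None: pass
        def open_help(self) -> None: pass

Because every new screen implements the same few archetypal responses, functionality can grow without adding buttons or new interactional rules.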
Because the user interface is based
primarily upon universally-understood symbols, support for additional
languages, as well as new or extended functionality can be added
without affecting, changing, or adding to the ALGO®3i's underlying
interactional architecture.
This is a key advantage to the creation of a flexible and
extensible interactional language for a product.
As the product evolves, it's not necessary to abandon the previous
interface, as so often happens when an interface is merely a cosmetic
look added atop a bolted-together set of engineered functions; the
product can truly expand within a familiar
and efficient interface. In the long run, the benefits far outweigh
the extra effort required in the original design to establish an
all-important interactional language.
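A brief sketch of why this pays off for localization: only the display strings live in per-language tables, while the symbolic mapping above never changes. The table below is purely illustrative, and the placeholder translations are mine, not the product's.

    # Per-language string tables; button symbols and semantics stay fixed.
    STRINGS = {
        "en": {"pass": "PASS", "refer": "REFER"},
        "es": {"pass": "PASA", "refer": "REFERIR"},  # placeholder translations
    }

    def label(key: str, lang: str = "en") -> str:
        # Fall back to English so an untranslated key never breaks a screen.
        table = STRINGS.get(lang, {})
        return table.get(key) or STRINGS["en"].get(key, key)

Adding a language is then a data change, not an architectural one.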
The device was designed to be usable
in a number of ways, in order to fit the different needs of its
wide target demographic. Unlike
other hand-held screeners, the ALGO®3i device screens both ears
simultaneously, saving the user time. To further decrease total
screening time, the ALGO®3i screener also has a unique SpeedScreen
function, which allows the user to conduct a test while entering
patient information. Flexibility in usage, while remaining efficient,
is an important part of making a product usable. Many devices, without an interactional language architecture,
fall back on an approach I've named the "Carnival Ride."
In Carnival Ride interfaces, the user is shuttled into a
long linear sequence of screens, each with a specific function or
options. The problem
inherent in many Carnival Ride interfaces is that they are very
inflexible, and are often prone to the confusing phenomenon of spelunking,
as described earlier in this presentation. This can be mitigated with proper visually-provided context
(e.g., a portion of the display used to note where the user is within
the larger system, along with feedback and feedforward hinting),
but these aids are often left out of poorly designed interfaces. In Carnival Ride interfaces, the user must conform to the interface,
instead of the interface conforming to the user.
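The contrast can be sketched in a few lines. The hypothetical structures below are mine, not from any shipped product: the first locks the user into one fixed sequence, while the second reaches every task from a hub and can always display context.

    # Carnival Ride: one fixed, linear sequence of screens.
    CARNIVAL_RIDE = ["enter_patient", "place_sensors", "screen", "review", "export"]

    # Hub architecture: every task reachable from Home, in any order.
    HUB = {
        "home": ["screen", "records", "settings", "help"],
        "records": ["review", "print", "transfer", "delete"],
    }

    def context_line(path: list) -> str:
        # Visually-provided context: show the user's place in the larger system.
        return " > ".join(step.replace("_", " ").title() for step in path)

    print(context_line(["home", "records", "transfer"]))  # Home > Records > Transfer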
The ALGO®3i software also includes
a "smart" Help system designed to assist with common issues encountered
during the screening process. The Help system may be accessed at
any time by pressing the Help "?" Button, which takes
the user from any window directly into the Help Index, at the
subject related to the present place in the interface.
The user is free to scroll through explanations and graphics,
and easily return to where they were in the operational activity.
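A minimal sketch of such a return-to-place Help mechanism, assuming a simple screen-to-topic table (the names here are hypothetical, not from the actual product):

    HELP_TOPICS = {
        "ScreeningScreen": "topic_screening",
        "RecordsScreen": "topic_records",
    }

    class HelpSystem:
        def __init__(self) -> None:
            self.return_to = None  # where to resume after help is closed

        def open(self, current_screen: str) -> str:
            # "?" pressed: remember the user's place, jump to the related topic.
            self.return_to = current_screen
            return HELP_TOPICS.get(current_screen, "help_index")

        def close(self) -> str:
            # Help dismissed: resume the interrupted activity where it was.
            screen, self.return_to = self.return_to, None
            return screen or "home"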
The ALGO®3i screener contains an infrared
(IR) port, which can be used to transfer or print screening results.
These screening results are stored in the Record Management
section of the software, which like all other functions is accessible
from the interface's Home Screen. The Record Management section is organized
in a simple manner, as a chronological listing of screening results
(PASS/REFER), and access to Print, Transfer (IR), and Delete functions. Screening records that have been transferred
are marked with an icon to indicate this. The IR port allows wireless
printing of screening results and wireless transfer of patient screening
data to a PC. The device is compatible with a variety of data management
systems, which allow hospitals to easily track patient results,
monitor the quality of their screening program, and transfer patient
data to state health departments.
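A sketch of the record-management model this implies, assuming an illustrative record type of my own devising and the 100-record transfer limit noted later in this section:

    from dataclasses import dataclass

    MAX_RECORDS_PER_TRANSFER = 100  # the IR port's single-transmission limit

    @dataclass
    class ScreeningRecord:
        patient_id: str
        result: str                # "PASS" or "REFER"
        timestamp: float
        transferred: bool = False  # drives the 'already transferred' icon

    def transfer_batch(records: list) -> list:
        # Send up to 100 not-yet-transferred records in one IR transmission.
        batch = [r for r in records if not r.transferred][:MAX_RECORDS_PER_TRANSFER]
        for record in batch:
            record.transferred = True  # mark so the icon appears in the listing
        return batch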
The ALGO¨3i provides several language
options for the screening process allowing it to meet the needs
of healthcare providers around the world. Upon first release, the
device incorporated English, Dutch, German, French, Spanish, and
Japanese languages. The screener can be extended to support even
more languages.
The device also incorporates the SoftClip™,
a multi-function component I designed and named, located on the back,
which allows the device to be placed or attached in
a variety of configurations. The SoftClip™ allows the screener to
be placed on a flat surface, mounted onto a rollstand, or attached
to a standard clinical nursery bassinet. The SoftClip's unique design
prevents the ALGO®3i device from falling either into the bassinet
or onto the floor, even when slightly jostled. The device is also
ergonomically designed to fit comfortably into a person's hand.
The Natus SoftClip™ also serves as a cleat, around which
the device cables can be wrapped for secure management
during storage and transport.
The outside case of the ALGO®3i screener
is injection-molded CYCOLOY® PC/ABS for strength and durability.
The SoftClip™ is injection-molded CYCOLOY® PC/ABS and overmolded
with Kraiburg TPE for soft feel and non-slip qualities. Internally,
the screener comprises five printed circuit boards, an eight-color
Hantronix liquid crystal display, three wire harnesses, a rechargeable
and environmentally-friendly lithium-ion battery, and various attachment
hardware. Because of the unit's small size, the printed circuit
boards required surface-mount technology. The device software is
stored in two EEPROMs. The device allows the user to recharge the
battery via an external power supply. The device communicates with
an external printer via the infrared port, which is also used
to transfer up to 100 records of patient data to an external computer
in a single transmission for further processing.
The processor inside the ALGO®3i is
a Texas Instruments TMS320VC33 DSP, running software written from
the ground up. A Xilinx
FPGA is also on the board, which handles some signal preconditioning
as well as glue logic. This low-level software approach allowed
the device to make use of a cost-effective, yet graphically effective
Hantronix LCD. This
strategy was crucial in the creation of a responsive and effective
user experience that was also extremely cost-effective and extensible,
and is another example of the tight integration of design efforts.
This is another thing that sets this device apart from devices
that might utilize a PDA such as a PocketPC as the platform. Such platforms are much more expensive,
and depending on how the software is implemented, could be less
responsive or more difficult to use.
From a development timeline standpoint,
the ALGO¨3i was designed, given FDA approval, and brought to market
in just a little over one year's time.
This was an extremely aggressive schedule, and demanded that
the small team of designers constantly model their designs and test
their assumptions with ad-hoc user groups throughout the development
process. This ensured the team's ability to make rapid
and detailed design and development progress while using important
feedback to drive the design iterations and refine various aspects
of the device and its user interface.
Enthusiastic reviews by doctors, clinicians,
and users around the world, brisk sales, and the MDEA and IDEA design
awards have proven that the design strategy for this product was
greatly successful. It stands as an embodiment of most, if not all,
of the many lessons I've learned in my twenty-year career as an
industrial and interaction design consultant.
Lesson Learned: It's possible to achieve great success by tightly integrating
industrial design, interaction design, information architecture,
and software design in a holistic, singular design effort.
Thoughts
on my design career and the challenges and opportunities ahead for
Information Architects and Interaction Designers
In today's world, where interactive
functionality and information are increasingly accessible and intertwined
everywhere, across diverse ecologies of devices and systems,
these lessons and strategies are more important than ever.
When I began my design career in the
early 1980s, there was much I expected to happen that didn't, and
much I didn't anticipate happening that did.
I expected that many in the 1980s (let alone beyond) would come
to recognize and understand the deep importance of interaction
design as a new form of architecture.
Instead of the design and configuration of physical bricks
and mortar, or metal and plastic, Interaction Design was to be,
in my vision, an emergent architecture of the real, but relatively
intangible interrelationships of function, affordances, and usage.
By the late 1980s, the success and
spread of the GUI, made famous by the Apple Macintosh and subsequent
computer platforms, had indeed led to an explosion in interaction
and interaction design.
But I could sense that this form of Interaction Design was
expanding primarily within what I'd considered a subset of the larger
user experience realm - media. CD-ROMs and the phenomenon of multimedia
exploded, utilizing the existing talents of pioneering multimedia
designers experienced in early environments such as Apple's HyperCard.
Interactive Games, and multimedia applications
of many sorts also took advantage of the large pool of graphic designers,
a field which had been boosted by the earlier technological phenomenon
of Desktop Publishing.
By 1989 I was beginning to understand
that my own work, and exploration of integrated physical, visual,
and informational systems and associated user experience strategies,
was by no means a mainstream approach.
Most "architectures" in multimedia applications
and device interfaces were, I felt, rather simplistically ordered
in rigid hierarchies or presented in fixed sequential "Carnival
Ride" style experiences.
With the InfoSpace project, I was already
wanting to move past what I saw as the simple models developed
at SRI in the 1960s and at Xerox PARC in the 1970s, and embodied
in commercial products in the 1980s. I was seeking systems that
utilized data architectures based on non-centralized metadata
strategies, and the simple but new and powerful interactive
visualizations those architectures made possible.
When I moved to Palo Alto in 1989,
one of the first presentations I made of the InfoSpace concept was
to Apple Computer's Advanced Research and Development Group. I was pleased that the entire group sat in on my presentation
of my ideas. I presented the browser
concept, its integration into the operating system's very
desktop as an interactive space, and the powerful model of interactive
data visualizations for files and returned hits (something that
would serve just as well for net-wide searching as for visualization
of and access to local files).
After my presentation, the then Director
of Advanced Research thanked me, but told me that Apple didn't think
the Internet was the future of the Macintosh. This was to be the first of many disappointing missed opportunities
to bring what I felt were changes of great magnitude to the user
experience and information architecture fields.
When I first saw the emergence of the
early World Wide Web in 1992 and the Mosaic browser in 1993, I felt
certain that the larger ideas and models explored and described
in the InfoSpace project would be recognized and pursued.
This turned out not to be true.
Instead, another thing happened, one that makes much more
sense when looking back at where the greatest amount of design and
interactional architecture work was being done when the Web emerged:
the field of CD-ROM-based multimedia.
Within a few years, by the mid-1990s
this was plainly evident.
Perhaps the cruelest blow that I suffered was when my own
professional organization, representing my home field of industrial
design, IDSA (Industrial Designers Society of America) decided to
split their annual design review in two. One review was for industrial design as
it had always been understood, and a separate (!) review was established
for "interactive media."
I was so disappointed. My entire career vision had been aimed
at the goal of establishing an experience record and body of work
to support the notion of the holistic integration of industrial
design and interaction design.
Instead, interface design was seen as a branch of graphics,
or multimedia (and eventually web design). The numbers were against my vision.
I was outnumbered by an entirely different conception of what
interaction design and interfaces were.
Another factor which limited my ability
to communicate my ideas was my position as a self-employed design
consultant. Though
my consultancy sometimes grew to several subcontractors or employees,
I was always tied up in the role of principal designer.
My long hours on complex projects throughout the 1990s precluded
my having the time to stop and write about what I was doing, or
pay to travel and participate in conferences.
There were many attempts I made to forge contacts with researchers
in the fields of data visualization and well-known research groups
such as Interval, but I was always rebuffed or turned away, once
with the admonition that, "we're really sorry we don't have
the bandwidth to work more with local practitioners."
"Local practitioner," was what I was.
I had no Ph.D. Beyond
a standard education in Design, I was self-taught in the field of
user experience, having forged my own approach. This left me at
odds with both the established academics in the field and the
economic mainstream of web and media design. The late 1990s brought
interesting projects in my own areas of interest, but were also
increasingly stressed by shortened time cycles and competition
from designers of the interactive-media type.
In 2001 I joined Pacific Consultants,
a 120-person engineering consultancy in Mountain View, California.
This was a great opportunity for me to leverage my broad
design skills with a wide range of engineering disciplines.
Pacific Consultants had electrical, radio frequency, software,
optical, medical, and mechanical engineering capabilities.
I was able to work very successfully as Director of Industrial
Design and Human Interface there for two and a half years, on projects
as diverse as the Army's Land Warrior program and the Natus ALGO®3i
Newborn Hearing Screener.
At Pacific Consultants I learned one of the most important
lessons of all, the power of integrating my design with a broad
team of engineers and technologists.
In 2004 I became Director of Design
and User Experience at PalmSource in Sunnyvale, California. I cannot yet disclose the nature of the
work I did there. It
was also a wonderful experience, however, and I was able to bring
the full range of my experience to the project.
In November 2004 I left PalmSource
to return to consulting and work on a startup that I'm an investor
in and design advisor for.
Though I experienced many successes,
disappointments, and struggles in the 1990s, I have a much better
understanding of how and why the field of interaction design and
information architecture has evolved the way it did.
Now, in 2005 there is a greater need than ever before for
integrated user experience development. The challenges of designing
effective and friendly interfaces for small portable devices are
fundamentally different from those discovered and solved in the
desktop and web worlds.
I frequently see models that are successful
in one type of platform or environment, fail when applied to a small
device. There is still
nowhere near the level of diversity and breadth of design experience
necessary to address the small device space.
The overwhelming majority of interaction designers and information
architects are still working solely in the desktop and web realms.
As I continue to work on a diverse
range of products, systems, and environments I am more motivated
than ever to use my extensive design and development documentation,
developed over many years, as a light to illuminate important regions
of the user experience field that many are not familiar with.
Regions that have been relatively far from the bright lights
of the larger design world and in relative darkness for far too
long.
Bio
James Leftwich, IDSA is founder and
principal of Orbit Interaction, a pioneering and broad-based interaction
design and intellectual property development consultancy located
in Palo Alto, California. Trained as an industrial designer, he
has over 20 years of broad consulting experience in the area of
Human Interface development.
From April 2004 through November 2004
he was Director of Design and User Experience at PalmSource, Inc.
in Sunnyvale, California, where he was involved in a proprietary
project. Prior to joining PalmSource, he was Director
of Human Interface and Industrial Design at PEMSTAR Pacific Consultants
in Mountain View, California. During that time he was the primary
industrial and user experience designer for PPC's varied projects,
including the Natus ALGO®3i Newborn Hearing Screener (winner of
the 2004 Medical Design Excellence Award - MDEA2004).
Before that he was a career-long independent
consultant as founder and Principal of Orbit Interaction. Leftwich's
interactional architecture and design has been part of numerous
award-winning products, including Sun Microsystems' SPARCworks™,
the acclaimed and award-winning Nike Triax™ line of runners' watches,
Macromedia's Dreamweaver™ web-building software, Kensington's
VideoCAMworks™ software, and many others. He has recently been
invited to serve on the Advisory Board for the Asilomar Institute
for Information Architecture.
He holds nine patents in interface
methodologies, including physical devices, visual interface display
methodologies, and information systems.
He received a BFA in Design from the Kansas City Art Institute
in December 1983.