User experience non-design – it’s not just tech devices; consider the stove’s ‘cockpit.’ Studying the screen and button choices on a new microwave, one wonders: who tested this interface? Did they really believe the combinations were self-explanatory and intuitive? Or is poor design so entrenched in microwave, oven, and washing machine interfaces that a ‘cockpit’ layout is expected, both by the vendor and the user? Cockpit is an apt term – imagine a pilot sitting down in the left seat of an airplane with zero training on what to touch first.
Mull over the car, which also takes a ‘cockpit’ approach that requires training. The history of tech user interfaces is instructive, leading to the ‘less (information) is more’ approach – no doubt stressful for novices. Some designs are ridiculous, but car dashboards intended for everyone are only somewhat better – witness the two screens on the Honda Prologue, the BMW Instrument Cluster issue forum, or the ‘innovation’ from Mercedes of putting a third screen in the car. Note the guide to Toyota warning lights, and the need to decode the Chevrolet dashboard. Little wonder some countries have enacted tough laws about using a cellphone while driving.
Consider the user interface of televisions. Today’s eas(ier)-to-use TV remote controls are the result of an evolution of complexity that began with Robert Adler’s 1956 invention, introduced as Zenith’s Space Command. From there, TV remote controls deteriorated into a nightmare of multiple purpose-built devices, followed by the poorly named ‘universal’ remote – one that universally requires some programming. A smart TV plus a sound bar, streaming media player, or some variant may have shrunk the number of devices in the basket. For now. Grabbing the wrong device, tapping the wrong highly sensitive button, and trying to figure out how to back up – that’s viewing life.
Voice control – does that fix everything? With the appropriate setup on a chosen platform, a mostly hands-free life with home devices is feasible – lights, temperature, speaker volume, and more. If the setup runs through a smart assistant, speaking (or yelling) instructions works. Gesture control is already available in some cars and may reach home automation in the future. But do consumers want that? Would it be more useful than speaking, tapping, typing, and button-pressing?
[Another in the series of posts preceding an upcoming report about user experience.]