The Evolution of UI Design
It can be broken down into four periods:
1. The age of tools
2. The age of the machine
3. The age of software
4. The age of the self
1. The age of tools
Early humans used primitive tools to communicate through drawings of animals and nature on stone. Hieroglyphs were one of the very first methods of communication. This later developed into art, writing, documentation and storytelling. Over time, tools improved and became more sophisticated, leading to the universally used and recognised tool that is the pen. While the methods have modernised, we still use simple symbols and iconography (emojis) to communicate.
2. The age of the machine
Productivity improved as a result of the industrial revolution. During 'the age of the machine', we built objects at scale in an attempt to make our lives easier. At this time, the user interface was still the hardware. An example of this is the typewriter: we began physically tapping keys to create words, a quicker alternative to the pen that gave a consistent, clean finish. These machines were mass-produced, the only drawback being that you had to learn to type to use one.
3. The age of software
To design the user interfaces that software needed, UI designers looked at people's behaviours and earlier hardware designs for inspiration. Thanks to the typewriter, people had already developed a mental model of a keyboard. The next step was to interact with text on screen, much like paper in a typewriter. Mini-keyboards and on-screen keypads followed the same pattern, just smaller. As more touchscreen options were developed, a new way of interacting was defined.
UI design's evolution has been shaped by common analogies, preceding hardware and intuition. This led to the introduction of skeuomorphism, which makes UI elements appear three-dimensional on a two-dimensional screen. It was an attempt to help users understand how to interact with the interface: elements were designed to resemble their physical counterparts. Apple famously led this trend under Steve Jobs. However, the style slowly evolved into a 'flatter' one as Jonathan Ive gained more influence within the company.
By 2013, the world was ready for less literal cues and began to appreciate more minimal interfaces. Designers were encouraged to shift away from gradients and drop shadows and to focus on the content, placing the UI in a supporting role. Google, meanwhile, moved to a different representation of the third dimension, using subtle layers to create depth.
As touchscreen smartphones became a normal part of society, people became familiar with a range of gestures like tapping, swiping and pinching in/out. This works because it's intuitive: when people see it, they know what to do. We are born to explore with our fingers. Touchscreen UIs work best on bigger screens, which is shown by how small wearable devices make screen interactions very difficult. If interfaces became too complicated, or even too simple, they weren't effective: "A user interface is like a joke. If you have to explain it, it's not good."
4. The age of the self
The last step of the evolution shows how everything comes full circle. The Apple Pencil is an example, combining hardware and software technology. It is human-centric because it pairs two familiar things: a pencil and a tablet. Humans have used this method since ancient times, carving into wet clay tablets. Designers must base products around innate behaviours, rather than forcing people to learn a new skill.
Enter voice user interfaces. Your voice is a powerful way of interacting with technology, with the added benefit of being hands-free. Hands-free technology is becoming more and more prominent. In the ambient world, Google's Project Soli uses a miniature radar to track the motion of a hand, enabling touchless gestures while making the interactions feel physical.
The user interface has developed dramatically. In some cases, designs went a bit too far before returning to their roots of simplicity. The UI is constantly changing, and as it develops further it may become more about the experience than the interface itself, much as the UI once moved from hardware to software.