
Designing for Tomorrow: Addressing UI Challenges in Emerging Interfaces (With Infographic)

Seasoned design experts offer insights into the latest UI trends—and provide tips for overcoming design challenges for emerging interfaces like augmented and virtual reality, voice and gesture control, and haptic feedback.


Toptal authors are vetted experts in their fields and write on topics in which they have demonstrated experience. All of our content is peer reviewed and validated by Toptal experts in the same field.

Christopher Holloway

Senior Writer

Christopher is a Senior Writer at Toptal with 13 years of experience researching and writing about how technology transforms sectors like banking, business, manufacturing, and healthcare. He was previously the business and technology editor at AméricaEconomíca and has developed content for Sony Pictures, Johnson & Johnson, Mini, and LATAM Airlines.

Featured Experts

Adam Kyle Wilson

Previously at Amazon

Adam is a product designer who specializes in digital, mobile, augmented reality, and user experience. His clients include Nike, Disney, Amazon, Activision, GE, Shell, NBA, Coca-Cola, Snap, and the International Olympic Committee.
Edward Moore

Previously at Electronic Arts

Edward is a UX designer who worked on award-winning projects for Google, Electronic Arts, and Sega. In 2014, he launched a company to provide strategic gamification and UX services to nongaming companies seeking to leverage AR/VR technology.
Clarke Noone

Previously at Barclays

Clarke is a designer who focuses on UX, UI, VR/AR animation and 3D design. He has more than 20 years of experience crafting immersive user experiences for companies such as Barclays and Skype.

Screens were the major focus of UI designers for decades, but that’s changing as technology evolves and expands into more corners of daily life. More than 123 million US adults are expected to use a voice-controlled digital assistant in 2023. Tactile feedback is omnipresent in wearables and is a growing trend in medical applications. Virtual and augmented realities are transforming so quickly as to require a whole new set of design best practices. And the Internet of Things is trying to merge all of these interactive experiences into a single, massive, almost invisible user interface that some call ambient intelligence.

As technology advances, so do methods of engaging with virtual and physical surroundings, which poses unique challenges for designers tasked with crafting intuitive user experiences.

Future UI design will focus on emerging interfaces like augmented reality, virtual reality, haptic feedback, voice control, gesture control, and wearable tech.
The future of interfaces calls for a shift in how designers think about the user experience.

This exciting age for UI design is comparable to the revolution that touchscreens ignited in 2007 with the launch of the iPhone—only this paradigm is much more complex. And just like skeuomorphism had to walk before material design could run, designing interfaces for these emerging technologies will require new blueprints and much experimentation. According to three leading design experts in the Toptal network, the moment for designers and companies to start down this road is now.

AR and VR: Think Outside the Rectangle

Immersive reality has come a long way since Google Cardboard opened the door to an accessible but rudimentary virtual reality experience. Now immersive reality integrates many new interfaces and design possibilities: haptic feedback, augmented reality (AR), virtual reality (VR), advanced sound design, and voice controls are some of the features available on a range of commercially available devices today.

The first modern approaches to immersive reality were, in most cases, the equivalent of looking at an app up close, says Toptal designer Adam Kyle Wilson. Users were mostly spectators on a low-resolution roller coaster, and designing a UI for VR was similar to creating the interface of a regular mobile app.

That’s no longer good enough—taking full advantage of the extended reality (XR) medium requires a paradigm shift, says Edward Moore, a Toptal UX designer and game developer who has worked on VR games and experiences for Google and Sony. “It’s easy to make floating rectangles in virtual environments,” Moore says, but over-relying on rectangles—the most popular shape used in interfaces—means you’re not taking full advantage of the immersive experience. “You have to think three-dimensionally. You need to ask: How do I interact with my actual reality?”

A critical element in UI design for immersive environments is the need to think like an industrial designer, with the human body at the center of the experience. How far does an arm have to reach to grab something? At what height should an object sit so that a standing user can interact with it intuitively? The human body is the primary input mechanism for AR/VR, and hardware is evolving quickly to refine how it interacts with digital worlds.
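To make those ergonomic questions concrete, the sketch below shows one way a designer might prototype default placement rules for an interactive panel based on a user’s height. It is illustrative only: the body-proportion ratios and offsets are rough placeholders, not validated anthropometric data or official platform guidance.

```python
# Illustrative sketch: derive a comfortable default placement for an AR/VR
# panel from a user's height. The ratios below are rough placeholders, not
# validated anthropometric data; real projects should rely on the target
# platform's ergonomic guidelines and user testing.

def default_panel_placement(user_height_m: float) -> dict:
    """Estimate a comfortable default position for an interactive panel."""
    eye_height = user_height_m * 0.93   # approximate standing eye level
    arm_reach = user_height_m * 0.44    # approximate comfortable forward reach

    return {
        # Keep frequently used controls slightly below eye level to avoid neck strain.
        "panel_center_height_m": round(eye_height - 0.15, 2),
        # Place the panel well within reach, not at the limit of the user's arm.
        "panel_distance_m": round(arm_reach * 0.8, 2),
        # Tilt the panel toward the user's gaze rather than keeping it vertical.
        "panel_tilt_deg": 15,
    }

if __name__ == "__main__":
    print(default_panel_placement(1.75))
```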

For example, Toptal designer Clarke Noone recently worked on a groundbreaking medical application of AR for knee replacement surgery. Due to technology constraints, there was a limited precedent for designing immersive interfaces with the precision needed for performing this delicate surgery, says Noone. “Knee surgery is not an easy task. You only need to be a millimeter out for the surgery to go wrong very quickly.”

So Noone and his team looked to gaming to find the most advanced interactive technologies. They used Unity, a leading game engine, to create the application for HoloLens, one of today’s most advanced AR devices. The app enables a surgeon to scan a patient’s knee to make an ultra-high-quality 3D mold for the replacement. It also helps determine where to make incisions to avoid major nerves, arteries, and cartilage.

With the HoloLens, this medical application overlays the surgeon’s view with information that can help them choose where to make incisions, for example.
A Figma UI mock-up of the surgeon’s view through the HoloLens headset. Courtesy of Clarke Noone.

As advanced as Unity and the HoloLens are, achieving submillimeter, medical-grade accuracy required a great deal of experimentation and iteration. “We needed extra care in the design and approval process. Everything was triple-checked against the requirements. It was not easy, but we fine-tuned our process and made it work for us.” The AR surgery application drastically cuts recovery time for knee replacements and reduces costs. As of 2023, its makers are preparing to file for FDA clearance.

Rules regarding designing for VR aren’t set in stone—yet. Wilson, Moore, and Noone recommend that designers check the documentation available and experiment with different tools and environments as soon as possible: Microsoft has a comprehensive design guide for the HoloLens—its flagship mixed-reality device. Apple dedicated an entire section of its Human Interface Guidelines to AR (ahead of the launch of its VR/AR headset), and Meta offers extensive documentation to design for the Meta Quest.

Beyond Displays: Voice, Gesture, and Touch

The Internet of Things (IoT) is a growing network of physical devices, vehicles, buildings, and infrastructure connected to the internet and equipped with sensors to gather and exchange information. A significant number of IoT devices are small, screenless machines with minimal functionality, such as the ability to detect a change in temperature and relay that change to another (probably larger) machine that triggers an action.
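As a rough illustration of that sensor-to-actuator pattern, the sketch below uses MQTT, a messaging protocol common in IoT, via the paho-mqtt Python library. The broker address, topic, and threshold are hypothetical; a real deployment would also handle authentication, retries, and device provisioning.

```python
# A minimal sketch of the sensor-to-actuator relay pattern described above,
# using MQTT via the paho-mqtt library. Broker address, topic, and threshold
# are hypothetical; each role would normally run on a separate device.
import paho.mqtt.client as mqtt
import paho.mqtt.publish as publish

BROKER = "broker.local"                    # hypothetical broker
TOPIC = "home/greenhouse/temperature"      # hypothetical topic

# --- On the screenless sensor: simply publish what it measures. ---
def report_reading(celsius: float) -> None:
    publish.single(TOPIC, str(celsius), hostname=BROKER)

# --- On the larger device: subscribe and trigger an action on a threshold. ---
def on_message(client, userdata, msg):
    if float(msg.payload.decode()) > 30.0:
        print("Too warm - triggering the fan")  # stand-in for a real actuator call

listener = mqtt.Client()                   # paho-mqtt 1.x-style constructor
listener.on_message = on_message
listener.connect(BROKER)
listener.subscribe(TOPIC)
listener.loop_forever()                    # blocks, waiting for sensor readings
```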

Voice and touch are the main ways to interact with screenless devices, and these interfaces come with their own set of challenges. For example, screenless voice interfaces that rely on spoken language (e.g., Alexa or Siri) are more prone to misinterpretations than visual interfaces. On the other hand, a device with no speakers or screens can only communicate through other devices—for example, a smart lock that users manage through an app on their phones—or by using haptic feedback, which, in most cases, is not very intuitive.

Besides applying UI design principles to these voice and touch interfaces, Wilson, who’s been working with consumer-oriented IoT products since 2013, recommends exploring and researching experiential design. Experiential design is all about conveying information through the thoughtful design of environments, such as retail stores, public spaces, or exhibitions. “What you are trying to build for your users is occurring in a physical space, not on the screen, so it takes a bit of a shift in thinking,” Wilson says. He recommends, for example, that UI experts designing for screenless devices or immersive experiences examine how an audio tour guides visitors through a gallery.

Similarly, Noone is working on an AR music production project that involves a virtual 360° wall of mobile interfaces. It relies heavily on hand tracking because it’s being designed for the Apple Vision Pro, which doesn’t use controllers. The lack of controllers created a major new challenge: no haptic feedback between the user and the virtual world.

“It required some thinking around how we transition haptic vibrations to visual cues,” Noone says. “For example, when previously touching a button, the controller would elicit a nice rumble, so we had to add hover states and a slight visual echo around the button to help the user know that they can interact with it just by using their hands.” He adds that for a different app, designers might use audio feedback such as a faint clicking sound to aid the user, but because this project was music-based, any audio cues would be drowned out.
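Stripped of any particular engine, the feedback Noone describes comes down to a proximity check: as a tracked fingertip approaches a control, visual affordances stand in for the missing rumble. The sketch below is a hypothetical, engine-agnostic version; the radii and visual values are illustrative.

```python
import math

# Hypothetical, engine-agnostic sketch of replacing haptic feedback with
# visual cues for hand tracking: a hover state plus a visual "echo" around
# the control. Distances are in meters; all thresholds are illustrative.

HOVER_RADIUS = 0.08   # fingertip this close -> show hover state
PRESS_RADIUS = 0.02   # fingertip this close -> treat as a press

def update_button_feedback(fingertip_pos, button_pos):
    """Return the visual state a renderer should apply this frame."""
    d = math.dist(fingertip_pos, button_pos)
    if d <= PRESS_RADIUS:
        return {"state": "pressed", "scale": 0.95, "echo_opacity": 1.0}
    if d <= HOVER_RADIUS:
        # Fade the echo ring in as the finger approaches, hinting at interactivity.
        proximity = 1.0 - (d - PRESS_RADIUS) / (HOVER_RADIUS - PRESS_RADIUS)
        return {"state": "hover", "scale": 1.05, "echo_opacity": round(proximity, 2)}
    return {"state": "idle", "scale": 1.0, "echo_opacity": 0.0}

if __name__ == "__main__":
    print(update_button_feedback((0.0, 1.2, 0.45), (0.0, 1.2, 0.50)))
```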

This AR music app has a floating 360° wall of music production tools that the user can interact with.
A modular synthesizer in a forthcoming AR music app. Courtesy of Clarke Noone.

Integration between physical and digital worlds will continue to increase in complexity as technology advances. Context-aware devices—from smartphones and tablets to smart thermometers—can understand and respond to their environments using sensors, software, and algorithms. A map reorienting to a user’s current location is an example of context awareness. On a larger scale, a context-aware device can detect a person’s presence via motion and infrared sensors when they arrive home or enter a room. These sensors, when connected to home devices such as Alexa or HomeKit, can automatically trigger actions like adjusting a thermostat. Hand-tracking devices like the Leap Motion Controller enable users to issue commands with gestures.
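At its core, a context-aware behavior like the one above is a rule that combines a presence signal with other context to choose an action. The sketch below is illustrative pseudologic, not tied to any particular smart home platform; the sensor inputs, setpoints, and schedule are hypothetical.

```python
from datetime import time

# Illustrative context-aware rule: combine a presence signal with time of day
# to choose a thermostat setpoint. Values are hypothetical; a real system
# would read presence from motion/IR sensors via a home hub.

def choose_setpoint_celsius(person_detected: bool, now: time) -> float:
    if not person_detected:
        return 16.0                          # save energy when nobody is home
    if now >= time(22, 0) or now < time(6, 0):
        return 18.0                          # cooler overnight
    return 21.0                              # comfortable daytime default

if __name__ == "__main__":
    print(choose_setpoint_celsius(True, time(23, 30)))   # -> 18.0
```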

Designers can help shape these interfaces, making them more intuitive—and more widely used—over time.

The Tools (and the Robots) Are Already Here

Designers don’t need to wait for an official guide to get started on projects like these. Wilson advises designers to invest their time and energy in these emerging technologies now, rather than remaining focused on web and mobile design. “It’s getting easier and easier to design a high-quality experience on mobile and web because of the maturity of the platforms. It’s a fiercely competitive and saturated market. If you are just starting to study design, by the time your career is complete, mobile and web probably won’t exist anymore.”

Designers looking to get started in the emerging interfaces space can learn the principles of designing for smart environments, find a suitable sandbox, and experiment and collaborate with other designers. Most XR experiences use industry-standard video game engines such as Unreal or Unity, for which the tech and designer community constantly creates new guides and tools. You can even find plugins to work with haptic and voice interfaces. “Just dive in and get comfortable with the editor interface within the engine of your choice,” says Noone. “Once comfortable, you will find that each engine’s UI design section shares many top-level features with Figma, Sketch, and Photoshop—even going so far as to let you set up design libraries, components, and smart objects just like your standard 2D design apps.”

Contrary to what you might think, you don’t have to know how to code to design for these emerging interfaces. Knowing programming can aid design decisions—but it isn’t a requirement, says Moore, who was a programmer before becoming a UX designer.

The Future of UI Design

With technological advancements will come challenges that innovative designers and companies will rise to meet. UI designers need to get comfortable working beyond the screen. They should start exploring and experimenting with new UIs now if they haven’t started already. And organizations should ensure that their teams are equipped with the latest UI tools, resources, and skills to meet future challenges so they don’t miss out on opportunities to advance their products.

AR, VR, voice, gesture, and touch capabilities will create more paths for improving our increasingly entwined digital and physical lives—if designers and companies embrace immersive interfaces and the expansive possibilities they generate. The principles of interaction design, product design, and experiential design—along with a healthy dose of experimentation—will pave the way.

• • •

This infographic illustrates some of the primary design considerations for AR/VR, voice, and haptic interfaces. By taking these factors into account, designers can ensure that emerging interfaces are intuitive and meet users’ needs.

The future of interfaces calls for a shift in how designers think about the user experience.

Download a PDF version of this infographic.

Understanding the basics

  • What is the future of UI design?

    The future of UI design will utilize AR, VR, voice, gesture, and touch features to provide more intuitive user experiences. Designers should get comfortable with working beyond screens and start experimenting with the latest technology.

  • Are AR and VR the future?

    Augmented reality, virtual reality, and sensory features such as voice, gesture, and touch create new possibilities for blending digital and physical experiences. To take full advantage of these technologies, designers need to expand their knowledge and skills. For example, designing for AR and VR requires a three-dimensional perspective, centering the human body as the primary input mechanism for interaction.

  • What is the future of IoT?

    The Internet of Things (IoT) is merging various technologies and experiences into a user interface that some call ambient intelligence. In this way, IoT focuses on users’ environments and predicting their wants and needs. What does a user experience when walking into a busy store? How will context-aware devices, such as smart thermometers, react when a user enters a room? Designers can explore experiential design to prepare for crafting optimal IoT experiences.
