Whenever we visit a new city or place, we use Google Maps. But have you noticed that it asks us to walk 20 meters north or south? Of course there is visual help on the screen, but what about people who can't see well? And am I the only one who gets confused when it says things like that? Is this a conversational UX problem? I think it would be helpful if, with the help of AI, it could tell me after which shop I am supposed to take a left or right, instead of talking in such a confusing way. Is this possible?
While Google Maps and other navigation apps have made efforts to improve accessibility by providing voice-guided directions, there is still room for improvement. Integrating AI technology can enhance the conversational experience and provide more contextually relevant instructions for users.
Here are a few potential solutions to make navigation directions more user-friendly for everyone, including people with visual impairments:
Instead of using cardinal directions (north, south, etc.), the AI could provide instructions based on recognisable landmarks. For example, it could say, "After passing the coffee shop, take a left," or "Turn right at the intersection with the red building." This way, users can navigate based on distinctive features in their surroundings.
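As a rough illustration of that idea, here is a minimal sketch in Python. The `Landmark` structure, the 30-meter "nearby" threshold, and the POI data are all hypothetical stand-ins for whatever a real maps backend would return near a turn point; this is not any actual navigation API.

```python
from dataclasses import dataclass

@dataclass
class Landmark:
    name: str
    distance_to_turn_m: float  # how far before the turn the landmark sits

def landmark_instruction(landmarks: list[Landmark], turn: str) -> str:
    """Phrase a turn relative to the closest landmark before it,
    falling back to a plain instruction when no landmark is nearby."""
    # Assumed cutoff: only landmarks within 30 m of the turn are useful cues.
    nearby = [lm for lm in landmarks if lm.distance_to_turn_m <= 30]
    if not nearby:
        return f"Take a {turn}."
    closest = min(nearby, key=lambda lm: lm.distance_to_turn_m)
    return f"After passing {closest.name}, take a {turn}."

print(landmark_instruction(
    [Landmark("the coffee shop", 12.0), Landmark("the bank", 80.0)],
    "left",
))  # After passing the coffee shop, take a left.
```

The key design choice is the fallback: when no distinctive landmark is close enough, the system should degrade gracefully to a plain instruction rather than name something the user can't find.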
AI can incorporate specific street names and numbers in the directions to provide more precise information. Instead of relying solely on cardinal directions, the instructions could be something like, "Turn left onto Main Street" or "Take a right at 5th Avenue."
Alongside any cardinal directions, the AI could include distance-based instructions. For example, it could say, "After 50 meters, turn right," giving users a better sense of how far they need to walk before making a turn.
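Distance-based prompts like that are straightforward to derive from GPS coordinates. The sketch below uses the standard haversine formula for great-circle distance and rounds to the nearest 10 meters, on the assumption that spoken distances should be coarse; the coordinates and rounding granularity are illustrative choices, not taken from any real app.

```python
import math

def haversine_m(lat1: float, lon1: float, lat2: float, lon2: float) -> float:
    """Great-circle distance in meters between two lat/lon points."""
    r = 6_371_000  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = (math.sin(dp / 2) ** 2
         + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2)
    return 2 * r * math.asin(math.sqrt(a))

def distance_instruction(user: tuple, turn_point: tuple, turn: str) -> str:
    """Speak the distance to the next turn, rounded to 10 m."""
    d = haversine_m(*user, *turn_point)
    return f"After {round(d / 10) * 10} meters, turn {turn}."
```

Rounding matters for the spoken channel: "After 50 meters" is easier to act on than "After 47.3 meters," especially for someone navigating by ear.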
The AI could utilise audio description capabilities to provide additional information about the user's surroundings. It could mention specific landmarks, such as parks, statues, or distinctive buildings, to help users orient themselves more effectively.
AI could take into account users' preferences and abilities, allowing them to customise the level of detail and complexity in the directions. This way, users who prefer more specific instructions can receive them, while others may opt for more general guidance.
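One way to picture that customisation is a single instruction builder that adapts to a per-user detail setting. The level names (`"minimal"`, `"standard"`, `"detailed"`) and the exact phrasing here are assumptions for illustration, not any existing app's settings.

```python
def build_instruction(turn: str, landmark: str, street: str,
                      distance_m: int, detail: str = "standard") -> str:
    """Compose one spoken direction at the user's preferred detail level."""
    if detail == "minimal":
        # For users who only want the essential action.
        return f"Turn {turn}."
    if detail == "standard":
        return f"In {distance_m} meters, turn {turn} onto {street}."
    # "detailed": add the landmark cue for users who navigate by surroundings.
    return (f"In {distance_m} meters, after passing {landmark}, "
            f"turn {turn} onto {street}.")

print(build_instruction("left", "the coffee shop", "Main Street", 50,
                        detail="detailed"))
```

Keeping the setting per-user (rather than per-route) is the point: a sighted user in a hurry and a blind user in an unfamiliar city need very different amounts of context from the same underlying route.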
Implementing these enhancements would require a combination of AI technologies, such as natural language processing, computer vision, and machine learning. It would also involve collecting and analysing more detailed mapping and location data.
While these improvements may not be available in their entirety at the moment, the ongoing advancements in AI and conversational UX provide promising opportunities for creating more inclusive and accessible navigation experiences in the future.