Speech Recognition API in iOS

By Jayesh Kawli

Elevator Pitch

This talk will focus on one such area of machine learning: speech recognition. I will give a live demo of the iOS Speech Recognition API and show how it can be used to improve the app experience. The talk will also cover the user experience aspects of using this API, including localization support and privacy protection.

Description

The Speech Recognition API is part of the Speech framework, which also powers Siri's speech recognition. Apple made this API available to developers in iOS 10 to enable better user experiences. It supports over 50 languages and dialects and comes equipped with state-of-the-art accuracy that requires no extra work on the app side.

Developers can use speech recognition in their applications as a dictation tool that requires no typing and saves users a lot of time. This offers convenience and easier access to app features through spoken commands, improving user engagement and the overall app experience. Speech recognition on iOS is available in over 50 languages and dialects, so developers don't need to do extra localization work; the API can be used with existing apps that support multiple geos and locales, which extends the app's capabilities manifold. Example use cases for this API are language translation, quick actions through spoken words (Go to "abc", do "xyz", etc.), and performing a search in the app with keywords.
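To make the dictation use case concrete, here is a minimal sketch of transcribing a pre-recorded audio file with the Speech framework. The `audioURL` parameter and the `en-US` locale choice are assumptions for illustration; a real app would pick the locale from the user's settings.

```swift
import Speech

// Sketch: transcribe a pre-recorded audio file.
// `audioURL` is a hypothetical URL to a recording bundled with the app.
func transcribe(audioURL: URL) {
    // Pick a locale explicitly; the API supports over 50 languages and dialects.
    guard let recognizer = SFSpeechRecognizer(locale: Locale(identifier: "en-US")),
          recognizer.isAvailable else {
        print("Speech recognizer is not available for this locale")
        return
    }

    let request = SFSpeechURLRecognitionRequest(url: audioURL)
    recognizer.recognitionTask(with: request) { result, error in
        if let result = result {
            // Partial results stream in as the user speaks;
            // `result.isFinal` marks the last, complete transcription.
            print(result.bestTranscription.formattedString)
        } else if let error = error {
            print("Recognition failed: \(error.localizedDescription)")
        }
    }
}
```

The same pattern applies to live microphone input via `SFSpeechAudioBufferRecognitionRequest`, which is what a dictation or voice-command feature would use.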

Application development and design go hand in hand, and they support each other on many fronts. When we talk about great UX and app feedback, this is where the Speech Recognition feature comes into the picture. As a UX designer, you will learn how to apply your expertise to build a powerful and intuitive experience in a speech-recognition-equipped app. This talk will provide a brief introduction to best practices for iOS user privacy and for making the app experience immersive. We will learn how an app should change state based on the user's speech activity and selected locale, and how to give users full control of the app through frequent visual cues. This will also give UX designers vital clues for designing other parts of the app that touch user information and are sensitive to privacy considerations.
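On the privacy side, speech recognition requires explicit user consent before any audio is processed. A rough sketch of the permission flow, assuming the app has declared the `NSSpeechRecognitionUsageDescription` key in its Info.plist (the OS shows that string in the permission prompt):

```swift
import Speech

// Sketch of the speech recognition permission flow.
// `completion` is a hypothetical callback the app's UI layer would supply.
func requestSpeechPermission(completion: @escaping (Bool) -> Void) {
    SFSpeechRecognizer.requestAuthorization { status in
        // The callback may arrive on a background queue; hop to main for UI work.
        DispatchQueue.main.async {
            switch status {
            case .authorized:
                completion(true)
            case .denied, .restricted, .notDetermined:
                // Update the UI so the user knows why dictation is disabled.
                completion(false)
            @unknown default:
                completion(false)
            }
        }
    }
}
```

Handling the denied and restricted states gracefully, rather than leaving a dead microphone button, is exactly the kind of state-driven feedback the talk covers.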

I have been using this API in a side project and also built a successful demo integrating it into the Wayfair app, which received positive feedback from designers, project managers, and fellow developers as a future addition to our shopping app. In this talk, I will share these experiences and the lessons learned in the process.

Notes

N/A