Real-time transcription and car crash detection in action


This is pretty cool - yes, a bit of an advert for Google's software on Pixel smartphones, but the functionality will roll out far wider in time. Google has funded two real user stories, each highlighting one aspect of its Pixel software: firstly, helping a deaf dad call his son for the first time ever, and secondly, raising the alert after a car crash. They're short and worth a watch.

Google’s new video series is called 'True Pixel Stories' and I'm sure we'll see more videos shortly. Embedded below are the first two, highlighting 'Live Caption' and 'Car Crash Detection'. 

In the first video, 'First call with my son', the Live Caption feature has made it possible for Matthew, who was born deaf, to have an actual phone call with his hearing son Harry. The feature generates subtitles for any audio playing on a Pixel device - and that includes phone calls:

Notably, Live Caption works even if offline, so it can be used anywhere.
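Google hasn't published how Live Caption is built, but the offline claim implies the speech-to-text model runs entirely on the device. Purely as an illustration, a minimal sketch of that idea - with a hypothetical `recognize` callable standing in for the local model - might look like this:

```python
from typing import Callable, Iterable, Iterator

def live_caption(audio_chunks: Iterable[bytes],
                 recognize: Callable[[bytes], str]) -> Iterator[str]:
    """Feed each chunk of device audio to an on-device recognizer
    and yield caption text as it becomes available.

    `recognize` is a stand-in for a local speech-to-text model;
    because nothing leaves the device, no network is needed -
    hence the feature working offline.
    """
    for chunk in audio_chunks:
        text = recognize(chunk)
        if text:  # a real model would return nothing for silence
            yield text

# Toy usage with a fake recognizer in place of a real model:
fake_model = {b"chunk1": "Hello", b"chunk2": "", b"chunk3": "Dad"}
captions = list(live_caption([b"chunk1", b"chunk2", b"chunk3"],
                             fake_model.get))
print(captions)  # ['Hello', 'Dad']
```

The names and shapes here are assumptions for the sake of the sketch; the real pipeline will also handle buffering, punctuation, and speaker turns.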

The second video, 'The crash that called', recounts an accident at a difficult intersection: the Pixel detected that its owner had been in a car crash, and 'Car Crash Detection' made the phone vibrate and ring. The user could then say "Emergency" or tap the on-screen button to make an emergency call and share their current location:
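The flow described in the video - detect a crash-level impact, prompt the user, then escalate to an emergency call with location - can be sketched as a small decision function. Everything below (the g-force threshold, the response values, the helper names) is hypothetical and not Google's actual implementation:

```python
from dataclasses import dataclass
from typing import Optional

# Hypothetical threshold: a sustained impact above this suggests a crash.
CRASH_G_THRESHOLD = 4.0

@dataclass
class Location:
    lat: float
    lon: float

def crash_response(peak_g: float, user_response: Optional[str],
                   location: Location) -> str:
    """Mirror the flow shown in the video.

    If a crash-level impact is detected, the phone vibrates and rings;
    saying 'Emergency', tapping the on-screen button, or not responding
    at all results in an emergency call that shares the location.
    A dismissal (e.g. "I'm OK") cancels the alert.
    """
    if peak_g < CRASH_G_THRESHOLD:
        return "no action"
    if user_response in ("Emergency", "tap") or user_response is None:
        return f"emergency call placed, sharing {location.lat},{location.lon}"
    return "alert dismissed"

# Toy usage:
print(crash_response(5.2, "Emergency", Location(51.5, -0.1)))
print(crash_response(1.1, None, Location(51.5, -0.1)))  # no action
```

The real feature also times out before calling automatically and uses far more than a single sensor reading; this is only the decision logic as the video presents it.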

All good stuff. Knowing Google, Live Caption will roll out officially to other Android manufacturers' handsets in due course, and I'm sure the car crash detection will be copied and implemented elsewhere too.

In fact, 'live caption' applications already exist on iOS (search for that phrase in the App Store) and Android (look for 'live transcribe'). All very useful in accessibility situations.