
Amazon launches an Alexa Auto SDK to bring its voice assistant to more cars

Amazon this morning announced the launch of a toolkit for developers that will allow them to integrate Alexa into cars’ infotainment systems. The “Alexa Auto SDK” is available now on GitHub, and includes all the core Alexa functions like streaming media, smart home controls, weather reports, and support for Alexa’s tens of thousands of third-party skills. It will also add new features just for auto users, like navigation and search, Amazon says.

The source code and function libraries are written in C++ and Java, allowing vehicles to process audio inputs and triggers, connect with the Alexa service, and handle the resulting interactions.
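That three-stage flow — detect an audio trigger, hand the utterance to the Alexa service, then act on what comes back — can be sketched as below. This is an illustration only: the actual Alexa Auto SDK on GitHub defines its own engine and platform-interface classes, and every name here is invented, with the cloud round trip replaced by a local stub.

```java
// Illustrative sketch only: the real Alexa Auto SDK defines its own
// engine and platform-interface APIs; all names below are invented.
public class AutoVoicePipeline {

    // Stub for the on-device wake-word / audio-trigger stage.
    static boolean detectTrigger(String audioTranscript) {
        return audioTranscript.toLowerCase().startsWith("alexa");
    }

    // Stub for the cloud round trip: the vehicle streams the utterance
    // to the Alexa service and receives a directive to act on.
    static String sendToAlexaService(String utterance) {
        return "DIRECTIVE:" + utterance.trim();
    }

    // Stub for the head unit handling the returned directive.
    static String handleInteraction(String directive) {
        return directive.replace("DIRECTIVE:", "handled ");
    }

    // Full pipeline: trigger -> service -> interaction handler.
    public static String process(String audioTranscript) {
        if (!detectTrigger(audioTranscript)) {
            return "ignored";
        }
        String utterance = audioTranscript.substring("alexa".length());
        return handleInteraction(sendToAlexaService(utterance));
    }

    public static void main(String[] args) {
        System.out.println(process("Alexa play my driving playlist"));
    }
}
```

In the real SDK, an integrator would implement the platform-specific pieces (microphone capture, audio output, connectivity) while the engine handles the protocol with the Alexa service.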

In addition, Amazon is offering a variety of sample apps, build scripts, and documentation supporting Android and QNX operating systems on ARM and x86 processor architectures.

For the time being, the SDK will allow for streaming media from Amazon Music, iHeartRadio, and Audible, and will let customers place calls by saying a contact's name or phone number. These calls are placed over the vehicle's native calling service.

It can also tap into the vehicle's native turn-by-turn navigation system when customers specify an address or point of interest, or cancel navigation.

A local search feature lets customers search for restaurants, movie theaters, grocery stores, hotels, and other businesses, and navigate to the location.
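Taken together, the calling, navigation, and local-search features amount to routing a recognized utterance to the right native vehicle service. A minimal dispatcher along those lines might look like this — again purely illustrative, with invented names and simple string matching standing in for real intent recognition:

```java
// Illustrative only: the real SDK exposes platform interfaces that a
// head-unit integrator implements; these names and rules are invented.
import java.util.Locale;

public class IntentRouter {

    // Decide which native vehicle service should handle an utterance.
    static String route(String utterance) {
        String u = utterance.toLowerCase(Locale.ROOT).trim();
        if (u.startsWith("call ")) {
            return "native-calling";     // placed over the car's own phone service
        }
        if (u.startsWith("navigate to ") || u.equals("cancel navigation")) {
            return "native-navigation";  // turn-by-turn handled by the head unit
        }
        if (u.startsWith("find ")) {
            return "local-search";       // restaurants, theaters, hotels, etc.
        }
        return "alexa-cloud";            // everything else goes to the Alexa service
    }

    public static void main(String[] args) {
        System.out.println(route("call Mom"));
        System.out.println(route("navigate to 123 Main Street"));
        System.out.println(route("find a grocery store"));
    }
}
```

In practice the Alexa service performs the intent recognition in the cloud; the vehicle's job is to carry out the resulting directive with its native calling, navigation, or search capability.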

This is not the first time Alexa has come to cars, by any means. Amazon has been working with carmakers like Ford, BMW, SEAT, Lexus, and Toyota, which have been integrating the voice assistant into select vehicles. Alexa is also available in older cars through a variety of add-on devices, like those from Anker, Muse (Speak Music), Garmin, and Logitech.

With this SDK, Amazon is opening the voice assistant to other developers building for the auto industry who don't yet have a relationship with Amazon.
