
How Is Apple Using Machine Learning? | @ThingsExpo #AI #ML #DL #DX #IoT

Today, machine learning is found in almost every Apple product and service

Today, machine learning is found in almost every Apple product and service. The company uses deep learning to extend battery life between charges on its devices, detect fraud on the Apple Store, recognize the locations and faces in your photos, and help choose news stories for you.

The concept of AI (Artificial Intelligence) has been the subject of many discussions lately. According to some predictions, AI will be able to learn by itself, outclass the capabilities of the human brain, and even fight for equal rights by the year 2100. Even though these are (still) just speculations, companies like Apple are already developing and implementing machine learning technology, which remains in its infancy. So how is Apple using machine learning?

Apple's beginnings with deep learning technologies
Let's start with Apple's early use of AI. During the 1990s, the company was already applying machine learning techniques in its products, notably for handwriting recognition. Those techniques were, of course, much more primitive than today's.

Today, as noted above, machine learning is found in almost every Apple product and service. It determines whether Apple Watch owners are really exercising or just walking around, and it figures out whether you'd be better off switching to the cell network when your Wi-Fi signal is weak.

Apple's smart assistant
In 2011, Apple integrated a smart assistant into its operating system, becoming the first tech giant to pull it off. That assistant is Siri, an adaptation of a standalone app that Apple had purchased (along with its development team). Siri launched to ecstatic initial reviews, but over the next few years users wanted to see Apple address its shortcomings. Thus, Siri got a 'brain transplant' in 2014.

Siri's voice recognition was moved to a neural-net-based system. The system began leveraging machine learning techniques including deep neural networks (DNNs), long short-term memory units, convolutional neural networks, n-grams, and gated recurrent units. Siri now ran on deep learning, even though it looked the same to users.

Every iPhone user has come across Apple's AI: when you swipe on your screen and get a short list of the apps you're most likely to open next, when the phone identifies a caller who isn't saved in your contacts, when a map location pops up for the accommodation you've booked, or when you're reminded of an appointment you forgot to put into your calendar. Apple's neural-network-trained system watches as you type, detecting items and key events like appointments, contacts, and flight information. The company does not collect this information; it stays on your iPhone and in your cloud storage backups, filtered so that personal details can't be inferred from it. All this is made possible by Apple's adoption of neural nets and deep learning.
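Apple's internal system here is neural-network based and not exposed directly to developers, but the public Foundation framework offers a simpler, rule-based relative of the same idea: NSDataDetector, which pulls dates, addresses, and links out of free text. The Swift sketch below is purely illustrative of that "key event detection" concept, not Apple's on-device ML.

import Foundation

// Minimal sketch: detect dates and addresses in typed text with the
// public NSDataDetector API (rule-based; shown only as an illustration
// of the kind of "appointment and contact detection" described above).
let message = "Dinner with Ana on Friday at 7pm, 1 Infinite Loop, Cupertino"
let types: NSTextCheckingResult.CheckingType = [.date, .address]

if let detector = try? NSDataDetector(types: types.rawValue) {
    let range = NSRange(message.startIndex..., in: message)
    for match in detector.matches(in: message, options: [], range: range) {
        if let date = match.date {
            print("Possible appointment: \(date)")
        } else if let address = match.addressComponents {
            print("Possible address: \(address)")
        }
    }
}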

At this year's WWDC, Apple showed how a new Siri-powered watch face uses machine learning to update its content (news, traffic information, reminders, upcoming meetings) in real time, surfacing each item when it is likely to be most relevant.

Making mobile AI faster with new machine learning API
Apple wants to make the AI on your iPhone as powerful and fast as possible. A week ago, the company unveiled a new machine learning API named Core ML. The most important benefit of Core ML will be faster, more responsive AI running directly on the Apple Watch, iPad, and iPhone. What does this cover? Everything from face recognition to text analysis, which will affect a wide range of apps.

The essential machine learning tools that the new Core ML will support include neural networks (deep, convolutional, and recurrent), tree ensembles, and linear models. As for privacy, the data that's used for improving user experience won't leave the users' tablets and phones.
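To make the API concrete, here is a minimal Swift sketch of how an app could run a bundled image-classification model on-device with Core ML and the Vision framework. FlowerClassifier is a hypothetical model name: Xcode generates a Swift class like it for whatever .mlmodel file you add to a project, so the exact class name and output labels depend on the model you choose.

import UIKit
import CoreML
import Vision

// Minimal sketch: classify a UIImage with a bundled Core ML model.
// "FlowerClassifier" is a hypothetical .mlmodel added to the Xcode
// project; Xcode generates a Swift class of the same name for it.
func classify(_ image: UIImage) {
    guard let cgImage = image.cgImage,
          let visionModel = try? VNCoreMLModel(for: FlowerClassifier().model) else {
        return
    }

    // Vision wraps the Core ML model and handles image scaling and cropping.
    let request = VNCoreMLRequest(model: visionModel) { request, _ in
        guard let results = request.results as? [VNClassificationObservation],
              let best = results.first else { return }
        print("Prediction: \(best.identifier) (confidence \(best.confidence))")
    }

    // Inference runs entirely on the device; no image data is sent to a server.
    let handler = VNImageRequestHandler(cgImage: cgImage, options: [:])
    try? handler.perform([request])
}

Because the model ships inside the app and prediction runs locally, this pattern lines up with the privacy point above: the data being analyzed never has to leave the phone.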

Making AI work better on mobile devices has become an industry-wide trend, so other companies are likely pursuing it as well. As for Apple, it's clear that deep learning technology has changed its products; it's less clear whether it is changing the company itself. Apple carefully controls the user experience, with everything precisely coded and designed in advance. With machine learning, however, engineers must take a step back and let the software discover solutions by itself. If Apple manages to adjust to this new reality, will machine learning systems one day have a hand in product design?

More Stories By Nate Vickery

Nate M. Vickery is a business consultant from Sydney, Australia. He has a degree in marketing and almost a decade of experience in company management using the latest technology trends. Nate is also the editor-in-chief at bizzmarkblog.com.