In the five years since Apple introduced iPhone owners to the mobile personal assistant called Siri, we humans have grown much more comfortable using our own words to take command of devices. In this age of big data, analytics, and constant surveillance, the least we can do is push for balance in the decades-old tug of war between artificial intelligence – with its ability to steal our jobs, drive our cars, and become our machine overlords – and “intelligent assistance,” which gives us humans more effective ways to leverage our own intelligence.
(In Machines of Loving Grace, his just-published business bestseller on the AI-versus-IA dichotomy, John Markoff refers to intelligent assistance as “intelligence augmentation.”)

Since Apple acquired Siri back in 2010, it has been joined by an ever-growing cohort of speech-enabled personal assistants from Amazon (Alexa), Microsoft (Cortana), Nuance (Nina), and Google (an unnamed resource that awakens when someone says “Okay Google”).
Collectively, these tech giants have been instrumental in defining what Opus Research refers to as the “conversational technologies” depicted in the “Intelligent Assistance Landscape” below:
Above: The Intelligent Assistance Landscape.
(You can access an interactive version of this landscape on VB Profiles.)
Automated voice processing and human-like text-to-speech give personality to applications and services by supporting spoken conversations. The roster of related technology includes visual avatars as well as resources that make the most of “non-verbal” input, including text input, touch, gesture, facial expression, and other attributes that can authenticate an individual’s identity, take stock of one’s emotional state, and perceive other “tells” that reveal intent.
The top level of the landscape is rounded out by “intelligent assistance technologies.” These are the real game changers when it comes to a system or service’s ability to derive meaning from spoken or keyed-in words. IBM Watson is the personification of this type of processing, which Big Blue calls “cognitive computing.” Two years ago, IBM formed a multi-billion-dollar business unit and created an “instant ecosystem” of business partners, developers, and customers for a fast-growing set of API-based resources, starting with Q&A but quickly adding “personality insights,” sentiment analysis, a natural language classifier, and a couple dozen others, all available through Bluemix, IBM’s developer cloud.
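These resources are exposed as REST APIs on Bluemix, so calling one typically amounts to a single authenticated HTTP request. The Python sketch below illustrates the general pattern against a natural language classifier; the endpoint path, classifier ID, credentials, and response field are placeholders modeled on that service rather than a verified copy of IBM’s exact interface.

```python
# Minimal sketch of calling a Watson-style classifier over REST.
# The endpoint path, classifier ID, credentials, and the "top_class"
# field are illustrative placeholders, not IBM's exact API surface.
import requests

BLUEMIX_USER = "username-from-service-credentials"   # placeholder
BLUEMIX_PASS = "password-from-service-credentials"   # placeholder
CLASSIFIER_ID = "example-classifier-id"              # placeholder

def classify_intent(text):
    """Send a short utterance to the classifier and return its top class."""
    url = ("https://gateway.watsonplatform.net/natural-language-classifier"
           "/api/v1/classifiers/{}/classify".format(CLASSIFIER_ID))
    resp = requests.get(url, params={"text": text},
                        auth=(BLUEMIX_USER, BLUEMIX_PASS))
    resp.raise_for_status()
    return resp.json().get("top_class")

if __name__ == "__main__":
    print(classify_intent("I want to check my order status"))
```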
Below the technologies, “applications” are organized into categories that reflect patterns of usage. “Mobile and personal assistants” are software agents like Siri that take a “horizontal” approach to the digital world. Individuals perceive them as generalists that become familiar with their personal attributes, preferences, and, ultimately, intents. Personal advisors act as subject matter experts whose familiarity with such topic areas as travel & entertainment options, medical advice, or financial planning makes them vital to the conversation. I’ll get to the virtual agents and customer assistants sector in some detail below.
“Employee assistants” refers to the use of both conversational and intelligent assistance technologies to simplify and speed up business activities. They are the tools of the trade for everything from scheduling meetings and routing deliveries to product lifecycle management and R&D.
The growing library of IBM Watson APIs has encouraged over 70,000 developers around the world to get acquainted with its capabilities, a domino effect that is driving exponential growth in usage. In a briefing to analysts in late September, IBM provided the following snapshot of Watson’s “instant ecosystem”: Over 10,000 apps were in test, controlled experiment, or production mode, and the ecosystem had 350 business partners across 17 industries operating in 36 different countries. Collectively, these apps generated a peak of 3 billion API calls per month.
Yet Big Blue and the giants of IT are not the only players on the pitch. As the chart below illustrates, we see nearly 70 companies, mostly newcomers, defining the space. They employ over 20,000 individuals and account for nearly $4 billion in market value.
What’s more, the need for a community of solutions providers has been growing steadily as new firms move to make mobile digital commerce more conversational at home, in cars, and at work. As illustrated below, the number of new firms entering the market has been following a near-exponential progression since 1995.
Above: New IA companies founded; the Intelligent Assistance space is blossoming. Source: Opus Research Ebook on VBProfiles — http://pages.vbprofiles.com/IntelligenceAssistance
The Heat Map below shows that technologies supporting natural language processing, machine learning, and semantic search are attracting roughly $70 million in investment, while elements of the “smart user interface,” referred to as conversational technologies, collectively draw around $76 million. Given the co-dependence of these two areas, Opus Research expects investment in them to grow in lockstep.
Above: Investment Heat Map. Source: Opus Research Ebook on VBProfiles — http://pages.vbprofiles.com/IntelligenceAssistance

Snapshot of the Virtual Agent and Customer Assistant Sector: Redefining Self-Service

The Intelligent Assistance Landscape has a special category for IAs that are put to work in enterprise settings, including a subcategory of “customer assistants.” In August, in a report entitled “Decision-Makers’ Guide to Enterprise Intelligent Assistants,” Opus Research cited 13 firms whose platforms offer human-like, automated services as a natural user interface for customer care, self-service, and sales.
We believe that this cohort, which represents about two-thirds of the total enterprise intelligent assistant market, generated roughly $200 million in revenue in 2014 and that top-line revenue is growing in the 30 percent range, putting it on track to exceed $1 billion in annual North American spending by 2020.
Above: Enterprise Spending on IA Tech and Services (North America). Source: Opus Research (2015)
This curve is a very conservative market assessment. Thanks to their ability to “learn” and add domain expertise, IAs in the enterprise are evolving to perform more complex tasks on behalf of individuals while continuously collecting data and learning how to do a better job the next time. By quickly recognizing each individual’s intent, they shorten the time it takes for people to carry out their everyday tasks and are becoming the go-to resource every time an individual initiates a search, browses an e-commerce web site, or carries out other forms of digital commerce. That’s why we expect that, within the next three years, enterprise IAs will be the primary point of contact for real-world commerce in the digital realm.
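As a rough sanity check on that projection, compounding the 2014 base at rates in the 30 percent range lands 2020 spending at around the $1 billion mark. The short Python calculation below illustrates the arithmetic; the specific growth rates sampled are assumptions chosen to bracket “the 30 percent range,” not figures from the Opus Research report.

```python
# Back-of-the-envelope check: ~$200M of 2014 revenue compounding
# "in the 30 percent range" through 2020. The sampled rates are
# illustrative assumptions, not numbers from the Opus Research report.

BASE_2014_MILLIONS = 200
YEARS = 2020 - 2014  # six years of compounding

for annual_growth in (0.30, 0.33, 0.35):
    projected = BASE_2014_MILLIONS * (1 + annual_growth) ** YEARS
    print("At {:.0%} annual growth, 2020 spending is roughly ${:,.0f}M".format(
        annual_growth, projected))

# Approximate output:
#   At 30% annual growth, 2020 spending is roughly $965M
#   At 33% annual growth, 2020 spending is roughly $1,107M
#   At 35% annual growth, 2020 spending is roughly $1,211M
```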