Although, as a society, we appear to be learning that the optimal configuration blends (hu)man and machine, the essential tasks of analyzing, interpreting and, perhaps most of all, communicating what we learn may soon be done quite competently by devices and their programs.
This follows the natural progression of technology adoption, in which successive waves of expertise-based silos (in telephony, electricity and, yes, IT) have eventually been subsumed by technological democratization.
Bernard Marr comments in Forbes:
Machine learning algorithms can analyze data and identify patterns, even interpret the data and produce reports and visualizations. Unless we can communicate that data, it is a waste of time and money. Natural Language Processing (NLP) can break down the barriers to analytics by teaching computers the spoken language of humans, eliminating the barrier between man and machine. The data scientist may be unneeded where lay persons can conduct their own analytics at will.
The job of data scientist — the quintessential big data job, and the job that was just voted the best job in America for 2016 — is at risk.
Data scientists have been called “unicorns” because finding the right person with the right set of skills — including coding, statistics, machine learning, database management, visualization techniques, and industry-specific knowledge — could be practically impossible. But machine learning and big data itself may be making those unicorns as obsolete as they are mythical.
New machine learning algorithms can autonomously analyze data and identify patterns, even interpret the data and produce reports and data visualizations.
You (and your computer) can be your own data scientist
While most people can see how certain information would be useful and what sort of insights might be derived from it, most lack the technical skills to perform the analytics. They might not have the computers that are able to carry out the large volume of calculations quickly enough to take action, but more often they lack the analytical skills to tell that computer what to do.
Natural Language Processing (NLP) technologies can help to break down the barriers to widespread use of data analytics by making complex analytics possible to just about anyone, regardless of their technical ability. In essence, NLP is teaching computers to accept input in the natural, spoken language of humans – eliminating the communications barrier between man and machine.
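To make the idea concrete, here is a deliberately toy sketch of what "accepting input in natural language" means at its simplest: keyword matching that maps a plain-English question onto an aggregation over tabular data. Real platforms such as Watson Analytics use far more sophisticated language understanding; the data and function names below are purely illustrative.

```python
# Toy natural-language analytics: map an English question to an aggregate.
# Real NLP systems parse grammar and intent; this only matches keywords.

SALES = [
    {"region": "East", "revenue": 120},
    {"region": "West", "revenue": 80},
    {"region": "East", "revenue": 100},
]

def answer(question, rows):
    """Crude keyword matching to pick an aggregate over the revenue column."""
    q = question.lower()
    values = [r["revenue"] for r in rows]
    if "total" in q or "sum" in q:
        return sum(values)
    if "average" in q or "mean" in q:
        return sum(values) / len(values)
    if "highest" in q or "max" in q:
        return max(values)
    raise ValueError("question not understood")

print(answer("What is the total revenue?", SALES))    # 300
print(answer("What is the average revenue?", SALES))  # 100.0
```

The gap between this sketch and a production system is exactly what cognitive platforms sell: robust intent detection rather than brittle keyword lists.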
IBM, for example, believes that it can offer a solution to the skills shortage in big data by cutting out the data scientists entirely and replacing (or supplementing) them with its Watson natural language analytics platform.
IBM’s Vice President for Watson Analytics and business intelligence, Marc Altshuller, explains: “With a cognitive system like Watson you just bring your question – or if you don’t have a question you just upload your data and Watson can look at it and infer what you might want to know.
“A traditional data scientist might receive training in R or SAS or whatever tool their school uses, but we found in the ‘citizen analyst’ area, they were often being given the wrong tools where they were required to guess the right answer, and then test their guess.”
I believe that Watson, and other NLP or cognitive technologies, will play an important role in the future of analytics and the education around it. As the value of data analytics becomes apparent in all fields of activity, a growing number of people will want to be able to extract insights from their data. They might not want to take three or four years out to learn advanced computer science and statistics, and with the advances in cognitive computing that won’t be necessary. All that is required might be a brief introduction to NLP technologies.
Gartner forecasts that the need for so-called ‘citizen data scientists’ — people who are in job roles that are not primarily about analytics but who could benefit from using data-driven insights — is going to grow five times faster than the need for highly skilled data science specialists. And it is these ‘citizen analysts’ that IBM is hoping to attract to working with Watson.
Visualizations at the click of a button
In addition, new technologies are emerging that will allow lay people in any field to create detailed infographics and other storytelling devices to help interpret the data NLP technologies will return.
Visualizations are usually used as a layer on the top of data, designed to make the data more digestible.
In big data analytics, reporting the insights we’ve gleaned from analyzing large amounts of messy data sets is the crucial “last step” of the process – and it’s often a step which causes us to stumble. We may have crunched terabytes of data in real time to come up with our world changing revelations. But unless we can communicate them convincingly to those who need to take action, they are useless, and worse than that, a waste of valuable time and money.
This is why data analysts have come to rely increasingly on graphics and visualizations combined with text – such as the now ubiquitous “Infographics” – to get a message across. But infographics rarely tell the whole story, and are still generally issued alongside written reports or summaries, particularly if they have a corporate purpose and detail is required. Again, this takes time and effort.
Programs that can visualize data start with the graphing functions available in Excel and get progressively more complex. But one program, called Quill, takes the trend a step further, producing text-based reports that explain the data clearly and concisely. Think of it as an executive summary created by a computer to explain a set of data. At the click of a button.
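The pattern behind such "executive summary at the click of a button" tools can be sketched in a few lines: template-driven natural language generation over a table of figures. Quill's actual analysis and templates are proprietary; everything below (the function name, the sample figures, the sentence template) is an illustrative assumption, not Quill's method.

```python
# Minimal sketch of template-based natural language generation:
# turn a dict of period -> value into a one-sentence textual report.

def summarize(label, figures):
    """Render a short executive-summary sentence from period -> value data."""
    periods = list(figures)
    first, last = figures[periods[0]], figures[periods[-1]]
    change = (last - first) / first * 100
    direction = "rose" if change > 0 else "fell"
    best = max(figures, key=figures.get)
    return (f"{label} {direction} {abs(change):.1f}% from {periods[0]} "
            f"to {periods[-1]}, peaking in {best} at {figures[best]}.")

report = summarize("Revenue", {"Q1": 100, "Q2": 130, "Q3": 120})
print(report)  # Revenue rose 20.0% from Q1 to Q3, peaking in Q2 at 130.
```

The hard part in real products is not the sentence template but choosing which facts in the data are worth narrating.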
Combined, these types of technologies mean that the human interface — the data scientist — may soon be as mythical as that unicorn, and simply unneeded in the big data landscape where lay persons can conduct their own analytics at will.