Rather than an invention of action, the data revolution is an invention of perception. Data creates connections across technologies, changing the way humans see and interact with the world.
Unfortunately, humans, storytellers by nature, don't intuitively communicate or understand data, especially in huge quantities. A trained individual, usually a statistician, can successfully interpret and translate small pools of information. But big data's volume is measured in zettabytes, and at that scale a single individual, or even a large team, can no longer keep up without incurring huge costs or failing to meet demands.
Companies must now rely on methods of cognitive data analytics that involve Artificial Intelligence (AI) tools, such as machine learning as well as natural language generation and processing. Two major advancements in the field of big data cognitive analytics are conversational platforms and intelligent narratives.
Back up. Why is storytelling important for business and data?
Storytelling is important for business because it creates a sense of trust and investment. A story relays cause and effect in a personal manner. This forms kinship with the consumer, reassuring them that robots are not running the show. A company has two stories: one for employees and one for the consumer audience.
When employees, especially leaders, know their company’s history and current moment, they contribute better to keeping on course or moving toward a better future. Likewise, for consumers: hearing facts with feeling draws them into being a part of the future, even if it means paying for it. On both sides, investment occurs, and the company wins.
Conveying data narratively is essential to building those stories convincingly. Data points without interpretation lack the linguistic markers most people need to grasp urgency or value, especially in decision making. Presenting them narratively, and with visuals, conveys significance. Until now, companies mostly managed this on their own, but data narration at the current scale is much harder without tools like natural language generation and processing.
Alright. What is natural language generation and processing?
When data is collected, it comes in several forms, like images, electrical signals, numbers, percentages and other measurements: things that are foreign to the average thought process. Natural Language Generation (NLG) and Natural Language Processing (NLP) are AI tools that help put big data into manageable, conversational terms.
Previously, when data was input, there was limited translation that could happen without human interpretation. The output could be graphs, trendlines and lists, but a human operated the programs that produced these. She also had to present the output, explaining the significance of the data. The current enormous volume of data available inhibits real-time analysis, immediate processing and story production. That’s where NLG and NLP come in.
NLG is an AI tool that “writes.” A computer scans input data sets and outputs coherent, understandable language that describes the information. An Excel sheet with 10,000 entries about margins over time can be read and returned as “Over the last 10 months, margins have risen by 50%, with the average margin moving from 40% to 60%.”
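The NLG idea above can be sketched with a toy template-based summarizer. Real NLG systems are far more sophisticated than this; the function name and data shape here are invented for illustration.

```python
# Toy sketch of template-based NLG: turn a series of monthly margin
# percentages into a one-sentence plain-English summary.

def summarize_margins(margins):
    """Render a list of monthly margin percentages as narrative text."""
    first, last = margins[0], margins[-1]
    change = (last - first) / first * 100  # relative change in percent
    direction = "risen" if change >= 0 else "fallen"
    return (
        f"Over the last {len(margins)} months, margins have {direction} "
        f"by {abs(change):.0f}%, moving from {first:.0f}% to {last:.0f}%."
    )

# Ten months of margins climbing from 40% to 60% (a 50% relative rise).
print(summarize_margins([40, 42, 45, 47, 50, 52, 54, 56, 58, 60]))
```

Even this crude template shows the shape of the task: numbers go in, a sentence a manager can act on comes out.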
NLP is the corollary AI tool that “reads.” A computer scans language and decides or recommends a course based on what it reads. Based on the increased margins of the previous example, an NLP AI might return the suggestion to lower prices or to increase stock of certain products.
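Continuing the example, the "reading" side can be sketched as a toy rule-based parser that extracts the margin trend from a narrative sentence and maps it to a suggestion. Production NLP relies on trained language models rather than regular expressions; the rules and thresholds here are invented for illustration.

```python
import re

# Toy sketch of rule-based "NLP": read a narrative sentence about
# margins and return a recommended course of action.

def recommend(narrative):
    """Extract the margin trend from text and map it to a suggestion."""
    match = re.search(r"margins have (risen|fallen) by (\d+)%", narrative)
    if not match:
        return "No margin trend detected; no recommendation."
    direction, amount = match.group(1), int(match.group(2))
    if direction == "risen" and amount >= 25:
        return "Margins are up sharply: consider lowering prices or increasing stock."
    if direction == "fallen":
        return "Margins are down: consider reviewing supplier costs."
    return "Margins are steady: no change recommended."

print(recommend("Over the last 10 months, margins have risen by 50%."))
```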
NLP and NLG systems are trained using a range of AI methods, including machine learning. They are used to build conversational platforms and intelligent narratives.
What’s the difference between a conversational platform and an intelligent narrative?
An intelligent narrative is the summation of an enterprise’s data, both traditional and dark, written in conversational form that explains its significance. It does the work of a hundred statisticians and data analysts in less time and with more nuance. An NLG system reads information from every available source and delivers a story that can be used to make decisions. An intelligent narrative contextualizes big data within the broader internal story of a business.
In a manufacturing plant, the inputs may be equipment damage reports, stock counts and purchasing history. In public safety, they may be geographic crime data and audio logs from emergency calls. In either case, the intelligent narrative returns results that management teams can use quickly and effectively.
A conversational platform is an interactive program that carries on conversations with humans in recognizable, natural language. Sometimes referred to as chatbots, these platforms engage almost as a human would. Mainstream examples include Amazon’s Alexa and Microsoft’s Cortana. A conversational platform can be a helpful tool for telling the external story, especially in customer service arenas.
Fido.ai, a vendor for conversational platforms, explains that a chatbot could answer questions from clients immediately based on computational knowledge of client data sets. Imagine a doctor’s office that sets itself up as a patient-first, always available service. By employing a chatbot with access to patient records and broader data, the medical firm could engage with patients 24/7.
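A minimal sketch of how such a chatbot might route questions, loosely based on the doctor's-office scenario above. Production conversational platforms use trained NLP models and real patient records; the intents, keywords and replies below are all invented for illustration.

```python
# Toy keyword-based intent matching for a patient-facing chatbot.
# Each intent maps trigger words to a canned reply.

INTENTS = {
    ("hours", "open", "closed"): "The office is open Monday-Friday, 8am-5pm.",
    ("appointment", "schedule", "book"): "I can book that. Which day works best?",
    ("refill", "prescription"): "I've sent a refill request to your physician.",
}

def reply(message):
    """Return the first intent reply whose keywords appear in the message."""
    words = set(message.lower().split())
    for keywords, answer in INTENTS.items():
        if words & set(keywords):
            return answer
    return "I'm not sure about that; let me connect you with our staff."

print(reply("Can I schedule an appointment for Tuesday?"))
```

Even this trivial matcher illustrates the always-available quality: the response is instant, around the clock, with escalation to a human only when no intent matches.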
What kinds of industries will benefit from these new technologies?
Any industry that relies heavily on data collection and remote monitoring will immediately benefit from these tools, because they shrink the lag between data collection and analysis. Example industries include cold-chain systems, wastewater management and agricultural operations. Manufacturing could also take advantage of big data analytics using NLG and NLP.
Another industry that will benefit is online retail, where stories revolve around knowing customer needs and fulfilling them quickly. Conversational platforms will interface with consumers, in turn generating data for intelligent narratives to produce actionable results that improve shopping experiences.