Signals Analytics Awarded Patent for Unstructured Data Extraction
Signals Analytics has announced a wide-ranging patent grant from the U.S. Patent and Trademark Office for the automatic extraction of information from unstructured data sources. The company says the patent validates its approach to enterprise AI and analytics using external data at scale, while easing the deployment and data management processes that are typically so difficult for companies to navigate.
The constantly shifting market landscape means that new business questions emerge on a regular basis. Generating answers and insights that can drive decisions requires new data sources and data types, which must be integrated into business intelligence systems.
This process requires database structures and analytic models to be updated, which presents major stumbling blocks for enterprises that implement analytics, mainly because typical analytic deployments are not built to scale and adapt.
As Gartner states in a recent research report, Evolving the Capabilities of Analytics and Business Intelligence Platforms, "a data and analytics architecture that can collect new forms of data in a flexible manner increases its ability to respond to the demand from the business for new data sources that can deliver unique business insights."*
When this flexibility does not exist, companies typically either fall back to older, manual methods of data collection or make further investments in custom analytics projects and teams, which are cost-prohibitive and take too long to be impactful.
This is important because research shows that companies that do implement analytics successfully are 23 times more likely to acquire new customers and 19 times more likely to be profitable.
"While the need for data and analytics is well-established and budgets grow for these solutions, deploying advanced analytics ends up being very challenging for many companies and many analytic projects end up failing. Under standard frameworks, every time there is a new business question that requires new data sets to find an answer, software engineers have to write new code. This doesn't scale when resources are so limited and market conditions change so quickly like they are now," said Yoram Landau, Vice President of Research and Development at Signals Analytics.
Take, for example, a typical scenario in which a company decides it wants to analyze trends by gender. It would first need to change the database structure to accommodate a new field. Then it would need to write code, or create and train models, to ensure the new data is ingested correctly into that field and that the system understands the text well enough that only the correct data is transferred.
The code needs to handle a continuous flow, indexing the data and mapping it correctly to the visualization tools. Finally, all of this needs to be tested with multiple data feeds to ensure accuracy before it can be deployed reliably.
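To make the amount of manual engineering concrete, here is a minimal sketch of the kind of changes involved. The table, column, and extraction rules are hypothetical illustrations of the steps described above, not Signals Analytics' actual pipeline.

```python
# Purely illustrative: the manual steps described above, spelled out.
# The "product_reviews" table and the gender rules are assumptions.

import re
import sqlite3

conn = sqlite3.connect("analytics.db")

# 1. Change the database structure to accommodate the new field.
conn.execute("ALTER TABLE product_reviews ADD COLUMN gender TEXT")

# 2. Write code (or create and train a model) so the new field is
#    populated correctly from the unstructured text.
def extract_gender(text: str) -> str | None:
    """Naive rule-based extractor standing in for a trained model."""
    if re.search(r"\b(she|her|hers|woman|female)\b", text, re.IGNORECASE):
        return "female"
    if re.search(r"\b(he|him|his|man|male)\b", text, re.IGNORECASE):
        return "male"
    return None

# 3. Re-run ingestion on the continuous feed, fill the new column, and
#    re-map it to the visualization layer; the whole flow then has to be
#    tested against multiple data feeds before redeployment.
rows = conn.execute("SELECT id, review_text FROM product_reviews").fetchall()
for row_id, text in rows:
    conn.execute(
        "UPDATE product_reviews SET gender = ? WHERE id = ?",
        (extract_gender(text), row_id),
    )
conn.commit()
```

Every new business question repeats some version of these steps, which is why, as noted above, the approach does not scale when resources are limited and market conditions change quickly.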
With the newly patented Signals Analytics capabilities, this process becomes much more streamlined. Now the data analyst only needs to add gender as a parameter in the visualization tool, and the Signals Analytics system automatically goes back to the beginning of the data flow and self-adjusts: it adapts the database structure and the taxonomy or existing machine learning models to extract the gender information from the text, and pushes the correct information into the visual models, without any further involvement of a software engineer.
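By way of contrast, the sketch below shows what such a self-adjusting flow could conceptually look like. This is not the patented implementation; the declared dimensions, table, and reconciliation function are all assumptions made for the example.

```python
# Conceptual contrast only, not the patented implementation: a hypothetical
# pipeline that reconciles itself against the dimensions declared in the
# visualization layer, so the analyst's declaration is the only manual step.

import re
import sqlite3

DECLARED_DIMENSIONS = ["category", "region", "gender"]  # "gender" newly added

def naive_gender_extractor(text: str) -> str | None:
    """Stand-in for the taxonomy or ML models that read the raw text."""
    if re.search(r"\b(she|her|hers|woman|female)\b", text, re.IGNORECASE):
        return "female"
    if re.search(r"\b(he|him|his|man|male)\b", text, re.IGNORECASE):
        return "male"
    return None

EXTRACTORS = {"gender": naive_gender_extractor}

def reconcile_pipeline(conn: sqlite3.Connection) -> None:
    """Adapt the database structure and backfill any newly declared field."""
    existing = {row[1] for row in conn.execute("PRAGMA table_info(product_reviews)")}
    for dim in DECLARED_DIMENSIONS:
        if dim in existing:
            continue
        conn.execute(f"ALTER TABLE product_reviews ADD COLUMN {dim} TEXT")
        extractor = EXTRACTORS.get(dim)
        if extractor is None:
            continue
        rows = conn.execute("SELECT id, review_text FROM product_reviews").fetchall()
        for row_id, text in rows:
            conn.execute(
                f"UPDATE product_reviews SET {dim} = ? WHERE id = ?",
                (extractor(text), row_id),
            )
    conn.commit()

reconcile_pipeline(sqlite3.connect("analytics.db"))
```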