Minsait, an Indra company, has developed a product that uses artificial intelligence (AI) to significantly reduce the average time needed to classify documents. Our Onesait Language solution classifies unstructured documents automatically and extracts the relevant information from them, all with the help of Natural Language Processing (NLP) technologies and neural networks. This results in a tangible improvement in business processes and document processing, delivering benefits such as improved productivity and efficiency.
NLP technology allows our solution to automatically understand written human language. In addition, we combine traditional Natural Language Processing with the latest deep learning techniques based on neural networks, which gives the product the capability to continually learn from its own mistakes. The solution includes an Intelligent Workspace that allows users to manage their workload, review classified documents, and monitor the automatically extracted data.
Among its main attributes, Onesait Language can extract information in several languages and has a scalable, flexible architecture that integrates smoothly with clients' existing systems. From a technical point of view, the system has been trained on legal and real estate documents and operates on a SaaS model with flexible billing. It also offers an accessible REST API for creating web services, and its modules can be integrated with any preexisting system.
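To illustrate how a client system might consume such a REST API, the sketch below builds a classification request and parses a response. The endpoint URL, JSON field names, and response shape are hypothetical assumptions for illustration, not the actual Onesait Language interface.

```python
import json
import urllib.request

# Hypothetical endpoint -- illustrative only, not the real Onesait Language API.
API_URL = "https://example.com/onesait-language/v1/classify"

def build_request(document_text: str) -> urllib.request.Request:
    """Build a JSON POST request asking the service to classify a document."""
    payload = json.dumps({"text": document_text, "language": "es"}).encode("utf-8")
    return urllib.request.Request(
        API_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
        method="POST",
    )

def parse_response(body: str) -> tuple:
    """Extract the predicted document class and the extracted fields
    from a response in the assumed JSON shape."""
    result = json.loads(body)
    return result["document_class"], result["extracted_fields"]

# A canned response in the assumed shape, for illustration:
sample = '{"document_class": "power_of_attorney", "extracted_fields": {"grantor": "ACME S.A."}}'
doc_class, fields = parse_response(sample)
print(doc_class)          # power_of_attorney
print(fields["grantor"])  # ACME S.A.
```

In a real integration, the request built by `build_request` would be sent with `urllib.request.urlopen` (or an HTTP client of choice) and the response body passed to `parse_response`.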
The solution applies to any sector that manages massive volumes of unstructured documents written in natural language. For example, Onesait Language is used in real estate and banking with documents such as contracts, deeds, agreements, court rulings, powers of attorney, etc. Data that until now was a challenge to extract and use can now add value and serve as a starting point for new operations in an increasingly digital world.
Improved Performance with Optimized Libraries on Intel® Xeon® Scalable Processors
As a member of Intel’s AI Builders Program, Minsait has access to the latest generation of Intel Xeon Scalable processors; this access, combined with technical support from Intel engineers, enabled us to optimize Onesait Language to improve model training performance, ultimately helping us reduce costs.
One of the optimizations used to achieve this was Intel’s optimization of TensorFlow, Google’s popular deep learning framework. Since 2016, Intel and Google engineers have worked together to optimize TensorFlow performance for deep learning training and inference on Intel Xeon processors using the Intel Math Kernel Library for Deep Neural Networks, a project that has since evolved into Intel’s oneAPI Deep Neural Network Library (oneDNN).
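In recent TensorFlow releases, these oneDNN optimizations are controlled by the `TF_ENABLE_ONEDNN_OPTS` environment variable (on by default for x86 Linux builds since TensorFlow 2.9). A minimal sketch of enabling them explicitly; the flag must be set before TensorFlow is imported:

```python
import os

# TF_ENABLE_ONEDNN_OPTS=1 turns on the oneDNN-optimized kernels in
# TensorFlow 2.x builds. It must be set before `import tensorflow`
# for the setting to take effect in that process.
os.environ["TF_ENABLE_ONEDNN_OPTS"] = "1"

# import tensorflow as tf   # training and inference now use oneDNN kernels
print(os.environ["TF_ENABLE_ONEDNN_OPTS"])
```

Setting the variable to `"0"` instead disables the oneDNN code paths, which can be useful when comparing performance against the stock kernels.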
At Minsait, we look forward to continuing our work with Intel on model optimization for their latest CPU architectures, and ultimately to co-marketing our solutions and continuing to offer quality products to our customers worldwide.