H2O brings AI grandmaster-powered NLP to the enterprise
There are about 1,200 chess grandmasters in the world, and only 250 AI grandmasters. In chess, as in AI, grandmaster is an accolade reserved for the top tier of professional players. In AI, the accolade goes to the top-performing data scientists in Kaggle's progression system.
H2O.ai, the AI Cloud company that raised $100 million in a Series E round at the end of 2021 and is now valued at $1.6 billion, employs 10% of the world's AI grandmasters. The company just announced H2O Hydrogen Torch, a product aiming to bring grandmaster-level AI for image, video, and natural language processing (NLP) to the enterprise.
We connected with H2O CEO and Founder Sri Ambati, and we discussed everything from H2O’s origins and overall offering to Hydrogen Torch and where it fits into the AI landscape.
H2O: A stack for AI
Ambati first started working with AI some decades ago, doing voice-to-text translation for the Indian space research program. He subsequently stumbled upon neural networks, which were still at an early stage at the time. As an immigrant in Silicon Valley, he worked at startups, and during sabbaticals split between Berkeley and Stanford he met mathematicians, physicists, and computer scientists.
Working with them, Ambati laid the groundwork for what would become H2O’s open source foundation. But it wasn’t until his mother got breast cancer that he was “inspired to democratize machine learning for everyone.”
Ambati set out to bring AI to the fingertips of every physician or data scientist solving problems of value for society, as he put it. To do that, he went on to add, math and analytics at scale had to be reinvented. That led to H2O, bringing together compiler engineers, systems engineers, mathematicians, data scientists, and grandmasters, to make it easy to build models of high value and high accuracy, very fast.
Over the years, H2O has built a whole product line to materialize this. When H2O started in 2012, Ambati said, there was a gap in scalable open source AI foundations. Languages like R and Python allowed people to build models, but those models were often slow, brittle, or not fully featured. H2O's contribution, per Ambati, was building "the world's fastest distance calculator."
This is a reference to the core math behind matrix multiplication in deep learning. Once you can calculate the distance between two long tensors, Ambati went on to add, you can start producing rich linear and nonlinear math across high-dimensional and low-dimensional data.
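To make this concrete, here is a minimal sketch, not H2O's actual implementation, of the kind of pairwise distance calculation Ambati describes, written in Python with NumPy; the array shapes are illustrative assumptions.

import numpy as np

# Two batches of high-dimensional vectors (one row per data point).
# Shapes are illustrative: 1,000 and 2,000 points in 512 dimensions.
A = np.random.rand(1000, 512)
B = np.random.rand(2000, 512)

# Pairwise squared Euclidean distances via the expansion
# ||a - b||^2 = ||a||^2 + ||b||^2 - 2 * a.b,
# which reduces the bulk of the work to one big matrix multiplication (A @ B.T).
sq_norms_a = (A * A).sum(axis=1)[:, None]   # shape (1000, 1)
sq_norms_b = (B * B).sum(axis=1)[None, :]   # shape (1, 2000)
dists = np.sqrt(np.maximum(sq_norms_a + sq_norms_b - 2.0 * A @ B.T, 0.0))

print(dists.shape)  # (1000, 2000): distance from every row of A to every row of B

The expensive part, A @ B.T, is exactly the dense matrix multiplication that hardware and math libraries are optimized for, which is why fast distance calculation underpins so much of the rest.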
That contribution is part of the H2O open source framework. Ambati calls this low-level foundation “the assembly language for AI.” Then H2O integrated frameworks and open source communities such as Scikit-learn, XGBoost, Google’s TensorFlow, or Facebook’s PyTorch. The H2O team started contributing to those, while eventually putting together an integrated framework in what would come to be known as AutoML.
H2O's products in that space are H2O AutoML, based on H2O open source and XGBoost, and a broader, closed-source offering called Driverless AI. Both target tabular data, the backbone of many enterprise use cases such as churn prediction, fraud prevention, and credit scoring.
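For a sense of what this looks like from the user's side, here is a minimal sketch using H2O's open source AutoML from Python; the file name and target column are hypothetical placeholders for a churn-style tabular dataset.

import h2o
from h2o.automl import H2OAutoML

h2o.init()  # start or connect to a local H2O cluster

# Hypothetical tabular dataset for a churn prediction use case.
train = h2o.import_file("customer_churn.csv")
target = "churned"
features = [c for c in train.columns if c != target]
train[target] = train[target].asfactor()  # treat the target as a classification label

# AutoML trains and cross-validates a leaderboard of models (GBMs, XGBoost,
# GLMs, deep learning, stacked ensembles) within the given budget.
aml = H2OAutoML(max_models=20, max_runtime_secs=600, seed=1)
aml.train(x=features, y=target, training_frame=train)

print(aml.leaderboard.head())            # candidate models ranked by cross-validated metric
predictions = aml.leader.predict(train)  # score data with the leading model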
Driverless AI has been "the engine of H2O economy," as per Ambati, over the last four years. It helped H2O acquire hundreds of customers, among them over half of the Fortune 500, including AT&T, Citi, Capital One, GlaxoSmithKline, Hitachi, Kaiser Permanente, Procter & Gamble, PayPal, PwC, Reckitt, Unilever, and Walgreens.
Ambati calls this layer “the compilers of AI.” This is where H2O started utilizing the grandmaster approach: dividing the problem space into a lot of recipes, assigning Kaggle grandmasters to each recipe, with the goal of distilling their knowledge to make things easier for teams on the ground.
The next phase after building a good machine learning model is safely operating that model. Data inherently has bias, and biased models should not go to production unchallenged. Finding blind spots, doing adversarial testing and model validation, deploying models, and then integrating them into the CI/CD of software building is what Ambati calls "the middleware for AI."
This is addressed with a hybrid cloud, on-premises, and edge offering by H2O: the AI Cloud. Customers use it through applications: there is an AI app store, a pre-built model store, and feature stores, crystallizing the insights coming out of model building. The AI Cloud is also multi-cloud, as customers want choice. Then there is H2O Wave, an SDK for building applications, as per Ambati.
Standing on the shoulders of web giants
Hydrogen Torch, the latest addition to H2O's portfolio, is tailored specifically to image, video, and NLP use cases, such as identifying or classifying objects, analyzing sentiment, or finding relevant information in a text. It's a no-code offering, of which Ambati said:
“It walks into the traditional space of web giants like Google, Microsoft, Amazon, and Facebook, and uses some of their innovation, but challenges them by allowing customers to use deep learning more easily, both taking pre-built models and transforming them for local use.”
Ambati referred to some early adopter use cases for Hydrogen Torch, such as video processing in real-time. In Singapore, this is done to identify whether traffic has picked up, or whether certain situations may result in accidents. The approach used is to take “traditional,” big machine learning models and then fine-tune them to the specific data at hand.
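To give a feel for what that fine-tuning step typically involves, here is a minimal PyTorch sketch, not Hydrogen Torch's internals, that adapts an ImageNet-pre-trained classifier to a local image dataset; the folder name and the three traffic classes are assumptions.

import torch
import torch.nn as nn
from torchvision import datasets, models, transforms

# Load a model pre-trained on ImageNet and swap its classification head
# for one sized to the local problem (here: three hypothetical traffic classes).
model = models.resnet18(weights="DEFAULT")
for param in model.parameters():
    param.requires_grad = False                 # freeze the pre-trained backbone
model.fc = nn.Linear(model.fc.in_features, 3)   # new, trainable head

# Hypothetical local dataset laid out as one folder per class.
transform = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])
data = datasets.ImageFolder("traffic_frames/", transform=transform)
loader = torch.utils.data.DataLoader(data, batch_size=32, shuffle=True)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

model.train()
for epoch in range(3):                          # a few passes often suffice when fine-tuning
    for images, labels in loader:
        optimizer.zero_grad()
        loss = criterion(model(images), labels)
        loss.backward()
        optimizer.step()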
Hydrogen Torch uses Facebook's PyTorch and Google's TensorFlow under the hood. H2O takes them and adds grandmaster expertise, plus an integrated environment. That also includes H2O's MLOps offering, which feeds off the data and machine learning pipelines going to production.
Models are continuously monitored to identify whether their accuracy has changed. That can happen because the pattern of incoming data has changed, or because the behavior of end users has changed. Either way, the model is then rebuilt and redeployed.
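As a rough illustration of that monitoring loop, and not a description of H2O's MLOps product, the sketch below tracks a model's rolling accuracy on labeled production traffic and flags when retraining may be needed; the window size, baseline, and threshold are assumptions.

from collections import deque

WINDOW_SIZE = 500             # how many recent predictions to judge the model on (assumed)
BASELINE_ACCURACY = 0.95      # accuracy measured at deployment time (assumed)
MAX_ALLOWED_DROP = 0.05       # tolerated degradation before retraining (assumed)

recent_outcomes = deque(maxlen=WINDOW_SIZE)

def record_outcome(prediction, actual):
    """Record whether the deployed model got one production prediction right."""
    recent_outcomes.append(prediction == actual)

def needs_retraining():
    """Signal retraining when rolling accuracy drops too far below the baseline."""
    if len(recent_outcomes) < WINDOW_SIZE:
        return False  # not enough evidence yet
    rolling_accuracy = sum(recent_outcomes) / len(recent_outcomes)
    return (BASELINE_ACCURACY - rolling_accuracy) > MAX_ALLOWED_DROP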
In addition, part of the Hydrogen Torch no-code offering is automated documentation generation, so that data scientists can drill down to explore what data was picked and what transformations were applied. Ambati claimed Hydrogen Torch model accuracy can be up to 30% better than baseline models, reaching the high 90s in percentage terms.
Of course, he went on to add, there is a well-known tradeoff in AI between accuracy, speed, and explainability. Depending on the use case requirements, choices have to be made. Speed, however, is somewhat of a universal requirement.
As far as speed is concerned, H2O's in-memory processing plays a key role in ensuring Hydrogen Torch can perform as needed for image, video, and NLP use cases. On a related front, H2O also has machine learning model miniaturization on its agenda. That will enable models to be deployed on more devices at the edge, and to perform better.
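Miniaturization can take several forms; one common technique, named here purely as an example rather than as H2O's specific approach, is post-training quantization. A minimal PyTorch sketch with a stand-in model:

import torch
import torch.nn as nn

# A small stand-in network; in practice this would be a trained model.
model = nn.Sequential(
    nn.Linear(512, 256),
    nn.ReLU(),
    nn.Linear(256, 10),
)

# Dynamic quantization stores Linear weights as 8-bit integers instead of
# 32-bit floats, shrinking the model and often speeding up CPU inference,
# which is what matters for edge deployment.
quantized = torch.quantization.quantize_dynamic(model, {nn.Linear}, dtype=torch.qint8)

example_input = torch.rand(1, 512)
print(quantized(example_input).shape)  # same interface, smaller footprint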
Hydrogen Torch also has synergies with another product in H2O's portfolio, namely Document AI, which processes incoming documents by combining image and NLP methods. Audio and video data from sources such as Zoom calls and podcasts are also proliferating, and H2O aims to help its customers keep up.
H2O has ongoing collaborations with high-profile customers, such as CommBank and AT&T. Experts from H2O and client organizations co-create machine learning models, and there is a revenue sharing scheme in place.
Ambati also identified more areas for future growth in H2O’s portfolio: Federated AI, content creation, synthetic data generation, data storytelling, and even areas such as data journalism are on H2O’s radar. The goal, Ambati said, is building trust in AI to serve communities. That is a grand vision indeed, for which progress is hard to measure. As far as product roadmap goes, however, H2O seems to be on the right track.