How Google is accelerating ML development

Nov 23, 2023

Accelerating machine learning (ML) and artificial intelligence (AI) development with optimized performance and cost is a key goal for Google.

Google kicked off its Next 2022 conference this week with a series of announcements about new AI capabilities in its platform, including computer vision as a service with Vertex AI Vision and the new OpenXLA open-source ML initiative. In a session at the Next 2022 event, Mikhail Chrestkha, outbound product manager at Google Cloud, discussed additional incremental AI improvements, including support for the Nvidia Merlin recommender system framework, AlphaFold batch inference and TabNet.

[Follow VentureBeat's ongoing Google Cloud Next 2022 coverage »]

Users of the new technology detailed their use cases and experiences during the session.

"Having access to strong AI infrastructure is becoming a competitive advantage to getting the most value from AI," Chrestkha said.

TabNet is a deep learning approach for tabular data that uses transformer techniques to help improve speed and relevance.

Chrestkha explained that TabNet is now available in the Google Vertex AI platform, which makes it easier for users to build explainable models at large scale. He noted that Google's implementation of TabNet will automatically select the appropriate feature transformations based on the input data, size of the data and prediction type to get the best results.
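
The session did not walk through the Vertex AI API itself, but the underlying TabNet technique is also available as the open-source pytorch-tabnet package. As a rough illustration of the approach (the synthetic data and hyperparameters below are assumptions, and this is the community library, not Google's managed implementation), training an explainable tabular regressor looks roughly like this:

```python
# Minimal TabNet regression sketch using the open-source pytorch-tabnet
# package. Illustrative only: Google's managed Vertex AI TabNet selects
# feature transformations automatically based on the input data.
import numpy as np
from pytorch_tabnet.tab_model import TabNetRegressor

# Synthetic tabular data: 10 numeric features, one continuous target.
rng = np.random.default_rng(0)
X = rng.normal(size=(2000, 10)).astype(np.float32)
y = (3.0 * X[:, 0] + X[:, 1] ** 2 + rng.normal(scale=0.1, size=2000)).astype(np.float32)

X_train, X_valid = X[:1600], X[1600:]
y_train, y_valid = y[:1600].reshape(-1, 1), y[1600:].reshape(-1, 1)

model = TabNetRegressor()  # default architecture; a managed service would tune this
model.fit(
    X_train, y_train,
    eval_set=[(X_valid, y_valid)],
    eval_metric=["mae"],
    max_epochs=50,
    patience=10,
)

preds = model.predict(X_valid)

# TabNet's attention masks yield per-feature importances, which is what
# makes the resulting models explainable.
print(model.feature_importances_)
```

The per-feature importances come from TabNet's attention masks; that interpretability, combined with automatic feature transformation selection, is what the Vertex AI version packages up for users.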

TabNet is not just a theoretical approach to improving AI predictions; it is already delivering positive results in real-world use cases. Among its early adopters is Uber.

Kai Wang, senior product manager at Uber, explained that a platform his company created called Michelangelo handles 100% of Uber's ML use cases today. Those use cases include ride estimated time of arrival (ETA), UberEats estimated time to delivery (ETD) and rider and driver matching.

The basic idea behind Michelangelo is to provide Uber's ML developers with infrastructure on which models can be deployed. Wang said that Uber is constantly evaluating and integrating third-party components, while selectively investing in key platform areas to build in-house. One of the foundational third-party tools that Uber relies on is Vertex AI, to help support ML training.

Wang noted that Uber has been evaluating TabNet against its real-life use cases. One example is UberEats' prep time model, which estimates how long it takes a restaurant to prepare food after an order is received. Wang emphasized that the prep time model is one of the most critical models in use at UberEats today.

"We compared the TabNet results with the baseline model and the TabNet model demonstrated a big lift in terms of the model performance," Wang said.

Cohere develops platforms that help organizations benefit from the natural language processing (NLP) capabilities that are enabled by large language models (LLMs).

Cohere is also benefiting from Google's AI innovations. Siddhartha Kamalakara, a machine learning engineer at Cohere, explained that his company has built its own proprietary ML training framework, called FAX, which now makes heavy use of Google Cloud's TPUv4 AI accelerator chips. He explained that FAX's job is to consume billions of tokens and train models ranging from hundreds of millions of parameters up to hundreds of billions.

"TPUv4 pods are some of the most powerful AI supercomputers in the world, and a full V4 pod has 4,096 chips," Kamalakara said. "TPUv4 enables us to train large language models very fast and bring those improvements to customers right away."

VentureBeat's mission is to be a digital town square for technical decision-makers to gain knowledge about transformative enterprise technology and transact. Discover our Briefings.
