Dr. Mazin Gilbert, AT&T Labs’ vice president of advanced technology, is placing his research bets on software defined infrastructure, big data, artificial intelligence and ways to scale and optimize the telecom giant’s network.
But first Gilbert needs some help from the open source community to enable the data sharing that will power many of these advances.
We caught up with Gilbert last week in New York City to talk shop. Here are a few highlights from our conversation:
Interest in quantum computing. AT&T Labs is looking at quantum computing in terms of the network. “We’re not looking to build a general quantum computer. It’s more about intelligence at the edge and exponential processing in the future,” said Gilbert.
Quantum computing is developing, but years away from being mainstream.
Intelligence at the edge. AT&T sees two network edges. One is tied to the customer and will be connected by 5G and devices. The other will be farther from the end customer but still relatively close, explained Gilbert. Gilbert said white boxes with compute, storage and networking will be programmed for multiple purposes. Many functions will move to the edge because it's not economical to move large amounts of data across the network and cloud, he explained.
“We’re moving from the cloud and latency to one where data will follow you and be intelligent,” said Gilbert. The processing for artificial intelligence, augmented reality, virtual reality and 360 video will also have to be closer to the edge.
5G’s role. Gilbert said 5G speed will have a big impact on the Internet of things and connecting to edge processing. “IoT is not about the phone, but billions of devices. A $1 device will have the intelligence of a phone. This fast access will be connected to the edge,” said Gilbert.
Artificial intelligence. The conversation with Gilbert was a bit like walking through the building blocks of interest to AT&T Labs, all ultimately leading to AI and data. “Who owns the most diverse data will lead,” said Gilbert. While technologies like 5G, IoT and even quantum computing are interesting in their own right, they are ultimately enablers for more AI.
AI Legos not snowflakes. AI has a big issue: frameworks don't fit together well, if at all. The AI space is fragmented, and data silos don't mix and match well. AI has to be transparent and open in a way that develops a broad set of tools that can be used by individuals and companies to create new use cases, explained Gilbert. That vision is why AT&T Labs is participating in the open source community.
“AI isn’t about snowflakes but Legos that can be put together in a standardized way to build applications,” said Gilbert. And connecting these Legos doesn't mean companies give up ownership of their data.
In October, AT&T and Tech Mahindra contributed code to The Linux Foundation to create a standard that can make AI apps reusable and easily accessible to any developer.
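The “Legos” idea boils down to components that all implement one standard interface, so any of them can be swapped or chained without giving up control of the underlying data. The sketch below illustrates that pattern in Python; all class and field names are hypothetical and do not come from AT&T's or The Linux Foundation's code:

```python
# Illustrative sketch of "AI Legos": components sharing a standard
# interface can be freely swapped and composed into applications.
# All names here are hypothetical, not from any real framework.

class Component:
    """The standard interface every 'Lego' implements."""
    def run(self, data):
        raise NotImplementedError

class Cleaner(Component):
    """Drop records that are missing the field we need."""
    def run(self, data):
        return [d for d in data if d.get("value") is not None]

class Scaler(Component):
    """Normalize values to the 0-1 range."""
    def run(self, data):
        top = max(d["value"] for d in data)
        return [{**d, "value": d["value"] / top} for d in data]

class Pipeline(Component):
    """Lego assembly: a sequence of components is itself a component."""
    def __init__(self, *steps):
        self.steps = steps
    def run(self, data):
        for step in self.steps:
            data = step.run(data)
        return data

pipeline = Pipeline(Cleaner(), Scaler())
result = pipeline.run([{"value": 4}, {"value": None}, {"value": 8}])
print(result)  # [{'value': 0.5}, {'value': 1.0}]
```

Because `Pipeline` itself satisfies the `Component` interface, whole assemblies can be nested inside larger ones, which is what makes standardized building blocks reusable across applications.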
Data prep. The dirty little secret about AI is that its success resides on data quality and having your own data house in order. “95% of the time AT&T spends on data is on data quality,” said Gilbert. “Data doesn’t come ready for TensorFlow. AI is like an exotic car with no gas.”
AT&T’s approach to data prep and cleanup revolved around the realization that the company couldn’t go back and fix old data. Instead, Gilbert said, the company took the main data feeds that were critical to it and focused on the top 50.
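The data-quality work Gilbert describes is typically this kind of validation, deduplication and normalization applied to each feed before any model sees it. Below is a minimal sketch; the field names and rules are hypothetical, not AT&T's actual pipeline:

```python
# A minimal sketch of per-feed data cleaning: validate, filter
# impossible measurements, deduplicate, and normalize types.
# Field names ("id", "latency_ms") are hypothetical examples.

import math

def clean_feed(records):
    seen = set()
    cleaned = []
    for r in records:
        # drop malformed or incomplete records
        if r.get("id") is None or r.get("latency_ms") is None:
            continue
        # drop non-numeric or physically impossible measurements
        v = r["latency_ms"]
        if not isinstance(v, (int, float)) or math.isnan(v) or v < 0:
            continue
        # deduplicate on record id
        if r["id"] in seen:
            continue
        seen.add(r["id"])
        cleaned.append({"id": r["id"], "latency_ms": float(v)})
    return cleaned

raw = [
    {"id": 1, "latency_ms": 12},
    {"id": 1, "latency_ms": 12},    # duplicate
    {"id": 2, "latency_ms": None},  # missing value
    {"id": 3, "latency_ms": -5},    # impossible
    {"id": 4, "latency_ms": 30.5},
]
print(clean_feed(raw))
# [{'id': 1, 'latency_ms': 12.0}, {'id': 4, 'latency_ms': 30.5}]
```

Only two of the five raw records survive, which is consistent with Gilbert's point that the bulk of the effort in AI work goes into data quality rather than modeling.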
AT&T has 170 petabytes of data and millions of data feeds from its network. “AT&T realized a year ago that we needed a data transport platform,” said Gilbert. As a result, AT&T launched Indigo, which is designed to create a secure platform so communities can share data to work on projects of mutual interest. The full platform, AT&T Network 3.0 Indigo, is a trusted environment for data sharing and analytics collaboration.