There are several eagerly anticipated tech events over the course of a year that unite analysts, commentators, and geeks across the globe – from the Consumer Electronics Show in Las Vegas to the Mobile World Congress in Barcelona, and Apple Inc.’s (AAPL) new product unveilings.
The Google I/O annual developers’ conference is no exception.
Held last week in Alphabet Inc.’s (GOOGL) hometown of Mountain View, California, the conference featured the usual things that tech fans love.
The company provided updates to Android and Android Wear; new chat apps designed to compete with the likes of Snapchat and Apple’s FaceTime; Google Assistant, a new generation of “OK, Google” that aims to leapfrog Siri’s abilities; and a home assistant designed to compete with Amazon.com Inc.’s (AMZN) Echo. (More to come on the latter in a future issue.)
This is all headline-grabbing news for the masses, of course. But what about the “technology within the technology”?
In other words, the stuff that actually powers all these cool devices?
Well, Google had a big surprise…
New Chip Comes With a Big Boast
The company revealed a new computer chip that it designed and custom-built itself.
There had been rumors to this effect – after all, you can’t hire senior chip designers without industry insiders knowing about it – but the progress the team has made was a huge surprise to the industry. The chip has actually been in Google’s server farms for over a year!
The new Tensor Processing Unit (TPU) was designed specifically for machine learning and Big Data – exactly the areas Google is counting on to fuel its future growth.
And the company unveiled the chip accompanied by a big boast: Google claims the chip delivers more than 10 times the capability of any existing chip at comparable power consumption. That’s roughly three generations of Moore’s Law, or nearly seven years.
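The back-of-the-envelope arithmetic behind that claim is easy to check. A sketch, assuming one Moore’s Law generation doubles performance roughly every two years (the two-year cadence is our assumption for illustration, not Google’s figure):

```python
import math

# A 10x performance gain expressed as Moore's Law doublings:
doublings = math.log2(10)   # about 3.3 doublings, i.e. ~3 generations

# At roughly two years per doubling, that gain equals:
years = doublings * 2       # about 6.6 years, i.e. "nearly seven"
```

In other words, a tenfold jump at the same power budget is what the rest of the industry would normally need the better part of a decade to deliver.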
So what can this chip do?
Less Jargon, More Slaps
We’ll dispense with the technical jargon and keep it simple:
- It allows Google’s Street View to read road signs more quickly than before.
- It can learn language idioms so Google Assistant will understand people with accents or those who use uncommon phrases.
- It can instantly make sense of the massive amounts of data already on Google’s servers, and will multiply this 100-fold or more as the Internet of Things becomes entrenched in just about every activity imaginable.
Basically, it allows machines to understand humans and their environment, rather than requiring us to understand the machines.
So what does Google’s new chip innovation say to incumbent chipmakers like Intel Corp. (INTC), Advanced Micro Devices, Inc. (AMD), and NVIDIA Corp. (NVDA), and to server makers like Hewlett Packard Enterprise Company (HPE)?
Well, it’s a bit of a slap in the face.
Google depends on such companies for its servers, and the Tensor chip is a clear signal that Google thinks they’re not innovating quickly enough.
But it’s a huge opportunity, too…
Google Ignites a New Paradigm
For one thing, Google isn’t likely to sell its chips on the open market. And its advance will have other companies clamoring for more capable chips from the established chipmakers.
In addition, the Tensor chip doesn’t work alone – it depends on other high-end machine-learning-capable chips to do its job.
It’s also worth noting that custom chips aren’t new when it comes to advanced applications. Bitcoin mining, for example, now runs almost entirely on application-specific integrated circuits (ASICs), which have virtually taken over that market thanks to their high speeds and low power consumption.
And extremely advanced users such as the National Security Agency not only design their own chips, they also have their own fabrication facilities to manufacture them.
But what makes Google’s chip different is that it represents an endorsement of Big Data and machine learning and ushers in a new paradigm for advanced chipmakers.
Google Raises the Chip Bar
For several years, chipmakers have had a hard time selling customers on their most advanced processors – after all, home users haven’t needed top-of-the-line chips for anything beyond gaming graphics and video editing.
And browsing the internet – by far the most common activity on home computers and increasingly on mobile ones, too – doesn’t require very much computing power.
On the other side of that internet connection, service providers have been happy to connect hundreds or even thousands of medium-capability servers rather than buy fewer, more powerful ones. Indeed, that approach has sometimes been preferable because it lets companies place servers closer to end users, boosting speed as the customer perceives it.
But as more applications require the kind of deep intelligence that Google is introducing with its Tensor chip, other companies will have to rethink how they design their processes. More companies will conclude that it’s easier and cheaper in the long run to pay for the most advanced chips.
In that sense, Google has actually done a huge favor to the chipmakers it seems to have insulted.
Research and development budgets may need to rise slightly, but the payoff from that research could be spectacular in the long run.
To living and investing in the future,