Consider a trip from Times Square to Wall Street.
If you’re at all familiar with the New York City Subway, you know that’s an easy route – no train changes required.
Now, consider a trek from the Bronx Zoo to Flushing Meadows in Queens, where they play the U.S. Open.
If you can figure that out, you’re smarter than most advanced artificial intelligence systems.
According to researchers at Google’s DeepMind AI project, such systems can perform pretty simple tasks like picking out the best Times Square-to-Wall Street route 98.8% of the time. But when it comes to more complex trips, they have a success rate of just 37%.
But now, DeepMind’s big brains say they may have solved that problem with a memory system they’re calling a “differentiable neural computer” (DNC). With it, they may have found the key that unlocks the path to truly intelligent AI and deep learning.
Today, we’ll take a look at what DNC is.
And we’ll dig up a company that’s making memory breakthroughs like it possible.
This stock is 40% off its two-year closing high.
It’s going to get back there and higher pretty quickly – and make a nice 25% in just the next year.
The Memory Dilemma
Wall Street has a tendency to focus on advances in processing power that come from semiconductors.
And on the surface that makes a lot of sense. After all, we call it “Silicon Valley” because that’s where silicon-based chips have made exponential increases in computing power for the past 50 years.
We now have semiconductors that contain roughly 15 billion transistors on a chip about the size of a postage stamp.
These faster, more powerful chips have fueled huge breakthroughs that have changed virtually every aspect of our lives, from smartphones and tablets to cloud computing and Big Data to connected cars and smart homes.
But as important as processing power is, memory is just as much a must-have for our modern digital world.
Many electrical engineers say we’re running up against the limits of the ever-faster chips that artificial intelligence – the single most important “enabler” of the Singularity Era – needs, because memory just isn’t keeping up. Left unsolved, the memory challenge would curb AI’s true potential.
Without the amount of memory needed, AI can’t “store” the information it’s already mastered. So AI right now does extremely well with straightforward problems, but when faced with multistep reasoning – like the NYC Subway map – AI fails.
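To see why the subway trip is a memory problem, note that it’s really a multistep graph search: the solver has to “remember” every partial route it has explored so far. Here’s a minimal sketch of that kind of reasoning, using a hypothetical, drastically simplified station graph (the stations and connections below are illustrative, not the real subway map):

```python
from collections import deque

# Hypothetical, highly simplified station graph -- illustration only.
subway = {
    "Times Square": ["Wall Street", "Grand Central"],
    "Wall Street": ["Times Square"],
    "Grand Central": ["Times Square", "Bronx Zoo", "Flushing Meadows"],
    "Bronx Zoo": ["Grand Central"],
    "Flushing Meadows": ["Grand Central"],
}

def shortest_route(graph, start, goal):
    """Breadth-first search. Tracking every partial route explored so far
    is exactly the kind of working memory a plain neural network lacks and
    the DNC's memory module is meant to supply."""
    queue = deque([[start]])
    seen = {start}
    while queue:
        path = queue.popleft()
        if path[-1] == goal:
            return path
        for nxt in graph[path[-1]]:
            if nxt not in seen:
                seen.add(nxt)
                queue.append(path + [nxt])
    return None

print(shortest_route(subway, "Times Square", "Wall Street"))
print(shortest_route(subway, "Bronx Zoo", "Flushing Meadows"))
```

A conventional program solves this trivially; the point of the DNC is that a *learned* system can now read and write the equivalent of `seen` and `queue` to external memory instead of trying to cram them into its network weights.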
But in late October, Alphabet Inc. (Nasdaq: GOOGL) and its DeepMind team revealed the DNC. Modeled in part on the human brain’s own memory storage system, it adds a heavy-duty memory module to DeepMind’s existing deep learning neural network.
Now it can process and then store the info it needs to solve the sort of complex problems that had been perplexing existing AI – and that will unleash this innovation’s true tech and financial potential. The analysts at Tractica forecast that the global AI market will grow from $643.7 million this year to $36.8 billion by 2025. Don’t expect a 57-fold increase like that without big leaps like DNC.
And memory leaders like Micron Technology Inc. (Nasdaq: MU) are what’s making advances like DNC possible…
A Fine Mesh
Devices like DNC’s memory module store dynamic data to make computers – and AI platforms – run faster and more smoothly. They’re what allow you to open multiple windows on your web browser, work on a document, edit photos, and add graphics to a presentation, all while streaming music in the background.
With this approach, computers store active files in these memory systems so the machines can access the data faster than if they had to retrieve it from hard drives.
In other words, the more advanced the memory, the more smoothly the entire global network of computers will run.
Micron has pushed the boundaries of computer memories for years – often with stunning results.
Indeed, last year Micron shocked the tech world with a huge breakthrough called 3D XPoint. The new platform is nothing short of a whole new way of looking at memory.
Standard flash memory systems rely on transistors to store data. But 3D XPoint (pronounced “cross point”) deploys a microscopic mesh of wires that can be stacked on top of each other. These layers give 3D XPoint its third dimension.
The result is a single system that can handle both memory and storage, and it performs better than what’s out there today. XPoint storage is “nonvolatile,” like NAND flash, meaning it holds onto its data even when the power is off. And it runs 1,000 times faster than today’s nonvolatile NAND flash memory.
In a recent interview with Wired magazine, Patrick Moorhead, president and principal analyst at Moor Insights & Strategy, said XPoint could be a big boost to AI.
“It’s a radical, different design that nobody has,” Moorhead told Wired. “Any artificial intelligence or object recognition you want to have on a device works a lot better with XPoint… The more you can put into that really fast memory space, the better your artificial intelligence is going to be.”
Of course, XPoint could work just as well in other sectors beyond AI, including smartphones, Big Data, cloud computing, and virtual reality.
To put it all in context, just look at what’s up for grabs in Big Data alone. SNS Research says businesses invested $40 billion in Big Data systems this year.
With spending set to rise by 14% a year through the end of this decade, the sector will be worth $67.5 billion. If memory grabbed just 2.5% of that total, we’re talking a market worth nearly $1.7 billion.
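You can check that arithmetic directly – $40 billion compounding at 14% a year over the four years to 2020, then a 2.5% slice for memory:

```python
big_data_2016 = 40.0   # SNS Research estimate, in $ billions
growth = 0.14          # projected annual growth rate
years = 4              # 2016 through 2020

# Compound growth: 40 * 1.14^4
market_2020 = big_data_2016 * (1 + growth) ** years
memory_share = market_2020 * 0.025   # a 2.5% slice for memory

print(round(market_2020, 1))   # about 67.6 -- matches the ~$67.5B figure
print(round(memory_share, 2))  # about 1.69 -- "nearly $1.7 billion"
```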
And that may be too conservative. Grand View Research says next-gen memory such as 3D XPoint will rake in some $3.4 billion every year by 2020 because of its scope throughout the world’s major tech platforms.
Make no mistake. This is a big win for Micron, which already boasts the world’s broadest lineup of memory products…
Why Micron’s Going to Soar
Founded in Boise, Idaho, in 1978, Micron now has more than 20,000 patents and operates in more than 20 countries around the world.
And the breakthrough comes at a great time for the firm, which is in the midst of rising prices for dynamic random access memory (DRAM).
Since early 2014, prices for DRAM devices have gone from $1 per gigabyte to a low of 30 cents, and now back up to 50 cents. Micron’s fortunes have followed suit.
The stock fell from a high of $36.49 in December 2014 to $9.56 back in May. Now it’s made a comeback to around $20.
You may think there’s not much upside left, but the analysts at DRAMeXchange and elsewhere expect demand for DRAM, and therefore its price, not only to persist but to grow exponentially. That’s because memory-hungry technologies such as AI and VR are reaching larger and larger markets. (The need for NAND memory is also following this trend.)
Micron’s share price should, again, follow suit.
Moreover, Micron has invested for long-term growth through new products and a series of bolt-on buyouts.
Back in 2011, Micron unveiled the Hybrid Memory Cube (HMC). The HMC is a module that runs up to 15 times faster than most of the devices on the market today. Not only that, but the HMC uses up to 70% less energy and takes up about 90% less space.
Two years later, Micron bought Japan’s Elpida for $2.5 billion in a move that lowered costs and boosted output. Last year, it bought the remaining 67% stake in DRAM producer Inotera for $4.1 billion. Those moves helped Micron grow larger than its biggest rival, Samsung Electronics Co. Ltd. (OTC: SSNLF).
Even better for us, Micron’s new 3D XPoint gives us a “twofer” investment. That’s because to make the new product it joined forces with Intel Corp. (Nasdaq: INTC), the world’s largest chip firm.
The two companies haven’t said when they’ll begin selling the new memory system, other than that it should debut later this year. Nor have they given us details on how they’ll split sales between them.
But this much is clear. Micron is working with a mega-cap chip firm that has remained at the sector’s forefront for decades.
Like I said, the stock is on the mend. Since it hit bottom on Feb. 11, Micron is up 100%. But it’s got plenty left in it.
If it just regains its former high, we’ll see profits of 85%. And I believe we’ll make 25% in the next year.
Now, a few years hence, when “pocket” AI systems start finding the shortest path from the Bronx to Queens, you won’t be surprised – in fact, you’ll be able to brag to your friends that you predicted it.
Better yet, you’ll also be well on your way to using tech investments to fulfill your retirement dreams – and that’s an even more satisfying feeling.