Making use of fast data's golden potential

Micron Technology | May 2018

What does an assortment of data look like? When Micron Business Development Manager Eric Caward pictures a collection of data, he envisions a mountain, where each individual piece of dirt and rock represents a piece of information. At first glance, this mountain looks like a mound of big data—potential that may resemble a pile of dirt to some. But savvy miners know that some mountains contain flakes of gold.

In a mountain of data, these flakes represent valuable pieces of information that can be used to gain deeper insights. A collection of home temperature readings might not seem like the most interesting mountain of big data, but the trends inside could prove incredibly useful. If the home tends to overheat during a certain time of day, tracking that trend could help the homeowners better optimize their heating system and save money on energy.

In placer mining, where gold accumulates in loose material and water is used to extract it, miners began by panning for the gold flakes. But while panning is an easy technique, it’s not the best way to get gold from large deposits (just as simple scanning isn’t the best way to pull the right information from a mountain of big data). That’s why more efficient miners moved to sluice boxes and screening plants, processing large deposits more quickly to find their treasure.

How can a computer become a skilled miner and efficiently sift through this mountain of big data to reach and identify those flakes of gold? It takes fast data analysis to extract the important pieces of information effectively. That list of temperature readings in the cloud means next to nothing at face value, but if a computer system can scan it, identify the trends therein, and churn out a solution, it’s found the gold. Eureka!
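
To make that concrete, here is a minimal sketch in Python of the kind of trend-spotting described above. The readings and field layout are invented for illustration; the article doesn’t specify any implementation.

```python
from collections import defaultdict

# Hypothetical sample: (hour_of_day, temperature_F) readings pulled from the cloud.
readings = [(14, 78.5), (14, 79.1), (15, 80.2), (15, 81.0), (9, 68.4), (22, 66.9)]

def hottest_hour(readings):
    """Group readings by hour and return the hour with the highest average temperature."""
    by_hour = defaultdict(list)
    for hour, temp in readings:
        by_hour[hour].append(temp)
    averages = {hour: sum(temps) / len(temps) for hour, temps in by_hour.items()}
    return max(averages, key=averages.get), averages

hour, averages = hottest_hour(readings)
print(f"The home runs hottest around {hour}:00 (avg {averages[hour]:.1f} F)")
# A homeowner could schedule the heating system around this window to save energy.
```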

In order to best process this fast data, the computer system needs efficient memory with minimal delay. Just letting those trends stay hidden in the dirt won’t do anyone any good. Luckily, nothing moves data faster within a system than super-fast Dynamic Random Access Memory (DRAM).

[Chart: three ways data speed can be improved]

Informing Fast Data with Big Data

According to a 2016 article in Entrepreneur, the collection of data is growing at an incredibly fast rate. By 2020, each person online will create 1.7 megabytes of new data every second, adding to the 44 zettabytes of data that will already exist at that time.
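
As a rough sanity check of the scale those numbers imply, consider the arithmetic below; the global user count is an assumption for illustration, not a figure from the article.

```python
# Back-of-the-envelope scale of the 2020 projection cited above.
users = 4_000_000_000      # assumed number of people online; not from the article
mb_per_user_per_sec = 1.7  # the cited projection

global_pb_per_sec = users * mb_per_user_per_sec / 1e9  # 1 PB = 1e9 MB
zb_per_day = global_pb_per_sec * 86_400 / 1e6          # 1 ZB = 1e6 PB

print(f"~{global_pb_per_sec:.1f} PB of new data per second worldwide")
print(f"~{zb_per_day:.2f} ZB per day, on top of the existing 44 ZB")
```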

As technology grows more capable of monitoring the body’s vitals on a daily basis, whether through small wearable devices that detect things like heart rate and sleep patterns or medical innovations that monitor glucose levels and blood pressure, healthcare organizations can assist with preventative medicine in a revolutionary way. As IoT devices (nonstandard computing devices that connect wirelessly to a network and can transmit data) surge in popularity and tools to track different elements of a patient’s health continue to multiply, more big data is created every minute.

When an advertising agency determines which sponsored post to place on a user’s social media news feed, it has to sift through all available data to find the relevant pieces to speak to that user effectively. If the agency gets it right, the choice can amount to pure gold for the advertiser.

When an Artificial Intelligence (AI) program takes a look at an individual’s profile, it will see several bits of browsing data—perhaps a combination of Amazon browsing history, YouTube subscription boxes, and Google searches galore. Fast data can string those pieces of information together quickly, locating automotive merchandise in an online shopping cart, documenting previously viewed “how to change 2012 Ford Explorer brake pads” YouTube videos, and registering consistent DIY project articles. In this case, the program could easily churn out an advertisement for a local auto parts store. As AI gets smarter and data gets faster, the data will show that you’ve already purchased the brake pads, so the advertisement might instead focus on lug wrenches and jack stands needed to complete the job you’re likely working on.
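
A toy sketch of that matching logic might look like the following; the profile fields, signals, and ad copy are invented stand-ins, not how any real ad platform works.

```python
# Hypothetical browsing signals for one user profile.
profile = {
    "cart": ["2012 Ford Explorer brake pads"],
    "watched": ["how to change 2012 Ford Explorer brake pads"],
    "purchased": ["2012 Ford Explorer brake pads"],
}

def pick_ad(profile):
    """Rule-based stand-in for the AI ad selection described above."""
    brake_job = any("brake pads" in item
                    for item in profile["cart"] + profile["watched"])
    if not brake_job:
        return "generic auto parts ad"
    # Pads already purchased? Advertise the tools that finish the job instead.
    if any("brake pads" in item for item in profile["purchased"]):
        return "ad: lug wrenches and jack stands"
    return "ad: brake pads at a local auto parts store"

print(pick_ad(profile))  # -> ad: lug wrenches and jack stands
```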

“If the program utilizes a very fast memory system to instantly intercept and interpret the data and can push a relevant ad on a website instantaneously, it could result in a click-through and a sale,” Caward says.

In order to intercept that collection of data quickly enough, the units running these AI and machine-learning programs need enough bandwidth to take all of the big data stored in the cloud for a specific social media profile and browser history, identify the important flakes of gold, and analyze them in close proximity to the processing unit. The closer that important information, “what we typically call hot data,” Caward says, can get to the system’s processing unit, the more return value a user will see. This is why Micron is heavily invested in creating faster and more efficient memory solutions.
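
One familiar way software keeps hot data near the work is a small least-recently-used (LRU) cache that holds the most recently touched items and evicts the coldest. The sketch below is a generic illustration of that idea, not Micron’s design; the capacity and keys are arbitrary.

```python
from collections import OrderedDict

class HotDataCache:
    """Minimal LRU cache: keeps recently used ('hot') items close at hand."""

    def __init__(self, capacity=3):
        self.capacity = capacity
        self.items = OrderedDict()

    def get(self, key):
        if key not in self.items:
            return None                     # miss: fall back to slower storage
        self.items.move_to_end(key)         # mark as recently used
        return self.items[key]

    def put(self, key, value):
        self.items[key] = value
        self.items.move_to_end(key)
        if len(self.items) > self.capacity:
            self.items.popitem(last=False)  # evict the coldest entry

cache = HotDataCache(capacity=2)
cache.put("profile:123", {"interests": ["DIY", "automotive"]})
print(cache.get("profile:123"))
```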

From Hard Drives to Solid-State Drives and Speeding Things Up

It’s not just reliable DRAM that speeds up the movement of data within a system. A system can gain precious milliseconds by moving from a traditional hard disk drive (HDD) to a solid-state drive (SSD). A standard hard drive relies on mechanical movement: its platters have to physically spin and a read/write head has to travel to the right spot before the data can be read, and all of that motion takes precious time.

"When you move to flash (SSD) memory, you're not physically moving anything, so you can access that data much faster." Eric Caward Micron Business Development Manager

[Chart: fast data in healthcare and driving applications]

According to Caward, when you move to flash (SSD) memory, “you're not physically moving anything, so you can access that data much faster.”

Processors today are pushing the speed envelope, elevating standard 3- or 4-gigahertz clock speeds to 4.5 or even 5 gigahertz. As Caward puts it, if you’re “processing data at nanoseconds, and if you have to wait not microseconds, but milliseconds, to get your data, your CPU's just not doing anything for that extra time.” In an effort to avoid those wasted fractions of a second, memory is being placed closer and closer to processing units and designed for high-performance computation, in the form of GDDR5, GDDR5X, and GDDR6.
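
To put the nanoseconds-versus-milliseconds gap in perspective, here is a quick back-of-the-envelope calculation; the latency figures are rough illustrative ballparks, not measured benchmarks.

```python
# How many clock cycles a core wastes while waiting on each tier of storage.
clock_hz = 5_000_000_000  # a 5 GHz core

latencies = {
    "DRAM (~100 ns)": 100e-9,
    "SSD read (~100 us)": 100e-6,
    "HDD seek (~10 ms)": 10e-3,
}

for name, seconds in latencies.items():
    print(f"{name}: ~{seconds * clock_hz:,.0f} cycles idle")
# ~500 cycles for DRAM vs ~50,000,000 for an HDD seek, which is why memory
# keeps moving closer to the processor.
```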

In order to move this data as fast as possible at low latency and high bandwidth, Caward explains, the memory is “actually soldered next to the compute unit.”

Applying Fast Data to Today’s Technology

With quicker memory solutions already available and improving daily, machine learning and AI have endless applications, including what Caward calls this generation’s Holy Grail: the autonomous car. The sensors in these vehicles constantly monitor inputs, from traffic signals to the vehicle’s own position to its proximity to other objects (cars and people especially), and analyze what actions to apply to a given situation.

“You're just taking a massive amount of information into, basically, a supercomputer in your car, and you're processing that data,” Caward says. “You're pulling the non-useful data out to make that a little bit more manageable. You'll do some internal processing. You'll connect through various networks to go out to, potentially, the cloud, do even more processing, and then react accordingly—so you get to your destination in a safe manner.”
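
In code, that pipeline might be sketched as follows; every function and field here is a hypothetical placeholder, not a real autonomous-driving API.

```python
def filter_useful(sensor_frames):
    """Pull the non-useful data out to make the stream more manageable."""
    return [f for f in sensor_frames if f["relevant"]]

def process_locally(frames):
    """Fast on-vehicle decisions that cannot wait for a network round trip."""
    return [{"action": "brake"} if f["obstacle_m"] < 10 else {"action": "cruise"}
            for f in frames]

def process_in_cloud(frames):
    """Slower, heavier analysis (routing, map updates) done off-vehicle."""
    return {"route_update": "reroute around congestion"}

sensor_frames = [
    {"relevant": True, "obstacle_m": 8},
    {"relevant": False, "obstacle_m": 120},
    {"relevant": True, "obstacle_m": 45},
]

frames = filter_useful(sensor_frames)
print(process_locally(frames))   # immediate reactions
print(process_in_cloud(frames))  # longer-horizon planning
```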

Fast data works wonders in doctor’s offices, as well. Instead of sending a CAT scan to 3,000 doctors and having them each analyze the cells, a single CAT scan can be submitted to a neural network that has learned how to identify malignant cells by analyzing years’ worth of malignant and non-malignant cells.
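
A heavily simplified sketch of that workflow: train a classifier on labeled historical examples, then score features extracted from a new scan. The features and data below are synthetic stand-ins, not real medical data or a real diagnostic model.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
# Toy features (say, cell size and irregularity); malignant cells skew higher.
benign = rng.normal([1.0, 0.2], 0.2, size=(200, 2))
malignant = rng.normal([1.8, 0.7], 0.2, size=(200, 2))
X = np.vstack([benign, malignant])
y = np.array([0] * 200 + [1] * 200)

# Stand-in for "years' worth" of labeled malignant and non-malignant cells.
model = LogisticRegression().fit(X, y)

new_scan = np.array([[1.7, 0.65]])  # features extracted from one new scan
print(f"Malignancy probability: {model.predict_proba(new_scan)[0, 1]:.2f}")
```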

“Those detection rates are going to go way up,” Caward says. “Once it's put into the computer and you have the fast data, it can run on auto-pilot.”

Because Micron is advancing products that increase the speed at which data can be processed, it can help to expand these types of applications. Processors are getting faster and will continue to compute more and more information. But if a 26-core processor is only fed enough data to keep one core occupied, the other cores have nothing to do. Big data and fast data have much to offer, but only if they’re used to their fullest potential.
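
A quick illustration of the starved-cores arithmetic, with the per-core data demand assumed purely for the sake of example:

```python
# Illustrative numbers; per-core demand is an assumption, not a Micron figure.
cores = 26
per_core_gb_per_sec = 10   # data each core could consume
supplied_gb_per_sec = 10   # only enough to keep one core busy

busy_cores = min(cores, supplied_gb_per_sec / per_core_gb_per_sec)
print(f"Utilization: {busy_cores / cores:.0%}")  # ~4%: 25 of 26 cores sit idle
print(f"Bandwidth to feed all cores: {cores * per_core_gb_per_sec} GB/s")
```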

“You’ve got this piece, this silicon chip that processes logic, and it keeps things moving,” Caward says. “But it's moving at such a rate that you’ve got to keep the data transferring to it as quickly as possible. Otherwise, you're wasting its potential.”

By using Micron DRAM and SSDs to move this data to the CPU quickly, that potential won’t go to waste. Systems can sort through the mountains of big data to reveal the flakes of gold hidden inside, making way for new conclusions and insights.