© Quantum Corporation. All rights reserved.
An award-winning developer of next-generation electronics for active safety systems and autonomous vehicle technology needed to resolve a workflow bottleneck caused by an explosion of data and an increasingly complex workflow. Only Quantum's multi-tiered storage solution could get the development team back to full speed within budget, enabling a simpler workflow while providing greater performance than alternative solutions at roughly 10% of their cost.
The international organization in this case study is a leader in creating electronics for next-generation vehicles, including automated safety features and self-driving technology, with annual revenues in the tens of billions of dollars. Automotive electronics research has evolved rapidly in the last few years as active safety systems, like automatic braking and collision warning systems, have been shown to reduce accidents and make cars safer. The company recently saw its progress in this important program threatened when data volumes dramatically increased, colliding with the need for faster testing and more sophisticated analysis.
Expanded Data Sets for Automotive Research Drive Higher Data Volumes
Developing advanced automotive electronics depends on running sophisticated test and analytic programs on data sets collected from sensors and video cameras mounted on vehicles. As the number of cameras and sensors grew and their resolutions increased, the result was rapidly expanding data sets that needed to be stored and managed. The company went from storing a few hundred terabytes of data for its projects to more than a petabyte.
Sophisticated Analytics and Longer Data Retention Change Workflow
As the amount of data exploded, the deployment of high-performance computing (HPC) systems further strained the IT infrastructure, forcing changes in the workflow. The network-attached storage (NAS) systems weren't fast enough for the new HPC systems, so data sets had to be copied to local disk for engineering testing. Larger data sets also meant the teams needed faster storage performance to run their analytics, and the existing environment could not keep up. Compounding the problem was the need to retain all the old data sets for reuse as the team developed new, more advanced analysis software.
The IT team decided to transform its data management and storage systems to support the new demands of the business. It needed to create a system that could store more data, keep it for longer, protect it more effectively, and improve accessibility to all the teams working on these projects—and do it within budget.
Solution Combines High-Performance Disk and Tape Archive Under Single Point of Management
The IT team looked at all the available options and talked to a wide range of vendors. It considered expanding its NAS capacity, but rejected that approach due to a combination of high costs, data protection complexity, and inadequate performance. Instead, the company selected a tiered storage approach that uses Quantum’s StorNext data management software in a storage architecture that combines high-performance disk with an archive tier.
The StorNext solution offered three major advantages. StorNext was designed to work with very large files and with large numbers of files—factors that were becoming more and more important to the company's work. StorNext was also designed to enable collaborative workflows: teams can share data sets, and engineers can even work on the same files concurrently. And finally, it provided a tiered approach that included a tape archive to support massive storage capacity, automated protection, and lower total costs. The teams concluded that total costs with the StorNext solution could be as low as 10% of the other approaches they considered.
New Workflow Combines Access, Protection, and Archive
In the new workflow, research data is downloaded into high-performance disk arrays, which are now part of a Quantum StorNext–managed storage environment. The engineering teams' HPC systems access the data directly over high-speed Fibre Channel storage area network (SAN) connections. With StorNext, all users see the same data sets, and different teams can even run analytics on the same files at the same time. Overall, the workflow is now much faster and more efficient.
Protection has also become an automated part of the process. As soon as data lands on disk, StorNext automatically makes two copies in a StorNext AEL tape archive. The copies initially serve as a backup, and later one becomes part of an active archive. When the data on disk is no longer being actively worked on, the disk space is reclaimed, but one of the tape copies remains in the archive where it can still be accessed. StorNext presents copies in the archive through the same file system as the copies on disk—in the same directory location—so they remain directly accessible to the research teams.
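The lifecycle described above—data lands on disk, two tape copies are made, disk space is later reclaimed, and the file stays reachable at the same path—can be sketched as a simple state model. This is an illustrative sketch only, not StorNext code; all class and method names here are hypothetical.

```python
# Illustrative model of the tiered-storage lifecycle described in the text.
# NOT StorNext's implementation; every name below is hypothetical.
from dataclasses import dataclass


@dataclass
class ManagedFile:
    path: str
    on_disk: bool = True   # resident on the high-performance disk tier?
    tape_copies: int = 0   # copies held in the tape archive


class TieredStore:
    """A single namespace spanning a disk tier and a tape archive."""

    def __init__(self) -> None:
        self.files: dict[str, ManagedFile] = {}

    def ingest(self, path: str) -> None:
        # Data lands on disk, and the policy immediately
        # writes two copies to the tape archive.
        f = ManagedFile(path)
        f.tape_copies = 2
        self.files[path] = f

    def reclaim(self, path: str) -> None:
        # When the data is no longer active, free the disk space;
        # the archive copies keep the file available.
        self.files[path].on_disk = False

    def read(self, path: str) -> str:
        # Same directory location whether the data is on disk or only
        # on tape; a tape-resident file is staged back to disk first.
        f = self.files[path]
        if not f.on_disk:
            f.on_disk = True  # retrieve from the archive to disk
        return f"data:{path}"


store = TieredStore()
store.ingest("/projects/run42/sensors.dat")
store.reclaim("/projects/run42/sensors.dat")   # disk space freed
store.read("/projects/run42/sensors.dat")      # transparently staged back
```

The key design point the model captures is that reclaiming disk space never removes a file from the namespace: readers use the same path throughout, and only the retrieval latency changes.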
The engineers working on the project have been impressed with the amount of data they can access directly, and with the speed at which files are written back to disk from the archive for use.
Supporting Collaboration Between Sites
The second copy of data on tape serves two purposes. It provides an off-site insurance copy for disaster recovery (DR) protection, and it has also become a very effective way for design and engineering teams in different parts of the world to work on the same data. The team has found that the easiest way to share petabytes of data between North American and European sites is to ship archive tapes between locations. Data at the second site lets worldwide development teams share the same data sets while also providing a DR copy of these crucial files.
The future needs of this rapidly expanding field of engineering and design are far from clear, but they will almost certainly include more data, longer retention, higher performance, and more collaboration. The StorNext approach gives the IT team a wide range of options for meeting those needs, including the ability to scale performance and capacity independently—letting the archive grow very large while keeping the active disk work area as small as possible to control costs.