Data Storage

Results 1 - 25 of 955. Sort Results By: Published Date | Title | Company Name
By: Intel     Published Date: Aug 06, 2014
Designing a large-scale, high-performance data storage system presents significant challenges. This paper describes a step-by-step approach to designing such a system and presents an iterative methodology that applies at both the component level and the system level. A detailed case study using the methodology described to design a Lustre storage system is presented.
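The paper's worked example is not reproduced in this abstract, but the component-level step of such an iterative sizing methodology can be sketched roughly as follows. All node figures here are hypothetical placeholders, not Intel's numbers, and `size_lustre_osses` is an illustrative name, not anything from the paper:

```python
import math

def size_lustre_osses(target_gbps, target_capacity_tb,
                      oss_gbps=5.0, oss_capacity_tb=240.0):
    """Size the object storage server (OSS) tier for two requirements
    independently -- aggregate bandwidth and usable capacity -- then
    take the larger count, as an iterative component-level design would.
    Per-OSS figures are hypothetical placeholders."""
    by_bandwidth = math.ceil(target_gbps / oss_gbps)
    by_capacity = math.ceil(target_capacity_tb / oss_capacity_tb)
    return max(by_bandwidth, by_capacity)

# 100 GB/s and 5 PB usable: capacity is the binding constraint here.
print(size_lustre_osses(100.0, 5000.0))  # → 21
```

In practice each such estimate would be fed back into the system-level pass (network fabric, metadata servers) and iterated, which is the shape of the methodology the paper describes.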
Tags : intel, high performance storage
     Intel
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software. The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage - High-performance, highly scalable storage solutions with Intel® Enterprise Edition for Lustre* software and Intel® Xeon processor E7 based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Tags : 
     Intel
By: IBM     Published Date: Jun 05, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : ibm
     IBM
By: IBM     Published Date: Sep 02, 2014
Learn how GPFS accelerates data intensive work flows and lowers storage costs in Life Sciences, Energy Exploration, Government, Media & Entertainment and Financial Services by removing data related bottlenecks, simplifying data management at scale, empowering global collaboration, managing the full data life cycle cost effectively and ensuring end-to-end data availability, reliability, and integrity.
Tags : ibm, complete storage solution, gpfs
     IBM
By: IBM     Published Date: Sep 02, 2014
This brief webcast covers the new and enhanced capabilities of Elastic Storage 4.1, including native encryption and secure erase, flash-accelerated performance, network performance monitoring, global data sharing, NFS data migration, and more. IBM GPFS (Elastic Storage) may be the key to improving your organization's effectiveness and can help define a clear data management strategy for future data growth.
Tags : ibm, elastic storage
     IBM
By: IBM     Published Date: Sep 16, 2015
The IBM Spectrum Scale solution provided up to 11x better throughput than EMC Isilon for Spectrum Protect (TSM) workloads. Using published data, Edison compared a solution built on EMC® Isilon® against an IBM® Spectrum Scale™ solution. (IBM Spectrum Scale was formerly IBM® General Parallel File System™ or IBM® GPFS™, also known by the code name Elastic Storage.) For both solutions, IBM® Spectrum Protect™ (formerly IBM Tivoli® Storage Manager or IBM® TSM®) was used as a common workload performing backups to the target storage systems evaluated.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Six criteria for evaluating high-performance cloud services providers. Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to quickly adapt to changing business requirements, while dynamically managing workloads and data.
Tags : 
     IBM
By: Seagate     Published Date: Sep 30, 2015
Although high-performance computing (HPC) often stands apart from a typical IT infrastructure—it uses highly specialized scale-out compute, networking and storage resources—it shares with mainstream IT the ability to push data center capacity to the breaking point. Much of this is due to data center inefficiencies caused by HPC storage growth. The Seagate® ClusterStor™ approach to scale-out HPC storage can significantly improve data center efficiency. No other vendor solution offers the same advantages.
Tags : 
     Seagate
By: Intel     Published Date: Sep 16, 2014
In this Guide, we take a look at what an HPC solution like Lustre can deliver for a broad community of business and commercial organizations struggling with the challenge of big data and demanding storage growth.
Tags : intel, lustre*, solution for business
     Intel
By: IBM     Published Date: Nov 14, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : 
     IBM
By: Panasas     Published Date: Oct 02, 2014
HPC and technical computing environments require the collection, storage, and transmission of large-scale datasets. To meet these demands, datacenter architects must consider how increasing storage capacity over time will affect HPC workloads, performance, and system availability. While many enterprises have looked to scale-up NAS to meet their storage needs, this approach can lead to data islands that make it difficult to share data. Distributed, scale-out storage was developed to get around the technology limitations of scale-up NAS architectures.
Tags : 
     Panasas
By: Dell and Intel®     Published Date: Nov 18, 2015
Unleash the extreme performance and scalability of the Lustre® parallel file system for high-performance computing (HPC) workloads, including the technical ‘big data’ applications common within today’s enterprises. The Dell Storage for HPC with Intel® Enterprise Edition (EE) for Lustre Solution allows end users who need the benefits of large-scale, high-bandwidth storage to tap the power and scalability of Lustre, with simplified installation, configuration, and management features backed by Dell and Intel®.
Tags : 
     Dell and Intel®
By: Data Direct Networks     Published Date: Dec 31, 2015
Using high-performance parallel storage solutions, geologists and researchers can now incorporate larger data sets and execute more seismic and reservoir simulations faster than ever before, enabling higher-fidelity geological analysis and significantly reduced exploration risk. With the high costs of exploration, oil and gas companies are increasingly turning to high-performance DDN storage solutions to eliminate I/O bottlenecks and minimize risk and costs, while delivering a larger number of higher-fidelity simulations in the same time as traditional storage architectures.
Tags : 
     Data Direct Networks
By: Hewlett Packard Enterprise     Published Date: Apr 20, 2018
In an innovation-powered economy, ideas need to travel at the speed of thought. Yet even as our ability to communicate across companies and time zones grows rapidly, people remain frustrated by downtime and unanticipated delays across the increasingly complex grid of cloud-based infrastructure, data networks, storage systems, and servers that power our work.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: May 04, 2018
Multicloud Storage for Dummies consists of five short chapters that explore the following:
- How the multicloud storage model aligns with modern business and IT initiatives
- Common barriers to cloud adoption and how a multicloud storage model addresses them
- How to build a multicloud data center
- What to look for in multicloud storage services
- Real-world multicloud use cases
Tags : 
     Hewlett Packard Enterprise
By: Oracle     Published Date: Nov 28, 2017
Today’s leading-edge organizations differentiate themselves through analytics, furthering their competitive advantage by extracting value from all their data sources. Other companies are looking to become data-driven through the modernization of their data management deployments. These strategies include challenges, such as managing large, growing volumes of data. Today’s digital world is already creating data at an explosive rate, and the next wave is on the horizon, driven by the emergence of IoT data sources. The physical data warehouses of the past were great for collecting data from across the enterprise for analysis, but the storage and compute resources needed to support them cannot keep pace with this explosive growth. In addition, the manual, cumbersome tasks of patching, updating, and upgrading pose risks to data due to human error. To reduce risk, cost, complexity, and time to value, many organizations are taking their data warehouses to the cloud.
Tags : 
     Oracle
By: Cohesity     Published Date: Apr 24, 2018
As organizational needs change and workloads become increasingly distributed, a key realization is emerging: traditional approaches to backup and recovery may no longer be sufficient for many organizations. These companies may have discovered that their existing tools are not keeping pace with other advancements in their computing environments, such as scale-out storage systems and hyperconverged systems, which seek to reduce data center complexity and help manage surging storage costs.
Tags : 
     Cohesity
By: Cohesity     Published Date: May 04, 2018
Cohesity provides the only hyper-converged platform that eliminates the complexity of traditional data protection solutions by unifying your end-to-end data protection infrastructure, including target storage, backup, replication, disaster recovery, and cloud tiering. Cohesity DataPlatform provides scale-out, globally deduplicated, highly available storage to consolidate all your secondary data, including backups, files, and test/dev copies. Cohesity also provides Cohesity DataProtect, a complete backup and recovery solution fully converged with Cohesity DataPlatform. It simplifies backup infrastructure and eliminates the need to run separate backup software, proxies, media servers, and replication. This paper focuses on the business and technical benefits of Cohesity DataPlatform for the data protection use case. It is intended for IT professionals interested in learning more about Cohesity’s technology differentiation and the advantages it offers for data protection.
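As a rough illustration of the global deduplication idea mentioned above (a toy sketch of content-addressed chunk storage, not Cohesity's actual implementation; `dedup_store` and the 4 KiB chunk size are assumptions for the example):

```python
import hashlib

def dedup_store(blobs, chunk_size=4096, store=None):
    """Toy content-addressed store: each chunk is keyed by its SHA-256
    digest, so identical chunks across any number of backups are kept
    exactly once -- the core idea behind globally deduplicated storage."""
    store = {} if store is None else store
    for blob in blobs:
        for i in range(0, len(blob), chunk_size):
            chunk = blob[i:i + chunk_size]
            store[hashlib.sha256(chunk).hexdigest()] = chunk
    return store

# Two 8 KiB backups sharing one 4 KiB chunk: 16 KiB in, 3 chunks kept.
a = b"A" * 8192                       # chunks: A, A
b = b"A" * 4096 + b"B" * 4096         # chunks: A (duplicate), B
print(len(dedup_store([a, b])))       # → 2
```

Real systems layer reference counting, variable-size chunking, and distribution on top, but the hash-keyed store is the common foundation.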
Tags : 
     Cohesity
By: NetApp     Published Date: Mar 06, 2018
The company’s recently unveiled HCI platform leverages SolidFire’s all-flash scale-out architecture and performance management capabilities to support enterprise environments. Will it help NetApp make up ground against its fast-growing rivals?
Tags : netapp, database performance, flash storage, data management, cost challenges
     NetApp
By: NetApp     Published Date: Mar 06, 2018
Gartner Report: Competitive Landscape for Hyperconverged Integrated Systems. Accelerate business outcomes and achieve growth with hyperconverged integrated systems (HCIS). Key findings:
• "As the hyperconverged integrated system (HCIS) market has matured in the past two years, some smaller providers without sufficient bases, technical resources and cash to survive on their own have exited the market."
• "HCIS-only providers that have a sustainable sales advantage based on effective communication of technical strengths that can deliver superior cost savings or performance advantages are most likely to remain viable as the market continues to grow."
• "HCIS providers without compelling value propositions to differentiate their offerings against other HCIS solutions are unlikely to survive the increasing competitive pressure that is appearing in the market."
Tags : netapp, database performance, flash storage, data management, cost challenges
     NetApp
By: NetApp     Published Date: Mar 06, 2018
This Gartner Report on an effective hyperconvergence strategy includes their research findings and recommendations. It also includes a deep dive analysis of key determinants in a hyperconverged solution decision.
Tags : netapp, database performance, flash storage, data management, cost challenges
     NetApp
By: NetApp     Published Date: Mar 06, 2018
This Hyper Converged Infrastructure solution brief describes the key benefits of NetApp's next generation HCI solution including enterprise scale, efficient storage architecture, trustworthy data services, and IT operations transformation.
Tags : netapp, database performance, flash storage, data management, cost challenges
     NetApp
By: NetApp     Published Date: Mar 06, 2018
Read the Industry Update from Evaluator Group to learn what a transforming NetApp means not just for NetApp, but for a data-driven world.
Tags : netapp, database performance, flash storage, data management, cost challenges
     NetApp
By: NetApp     Published Date: Mar 06, 2018
This ESG report, commissioned by NetApp, provides insight into converged and hyperconverged platforms.
Tags : netapp, database performance, flash storage, data management, cost challenges
     NetApp
By: NetApp     Published Date: Dec 13, 2013
FlexPod Select with Hadoop delivers enterprise-class Hadoop with validated, pre-configured components for fast deployment, higher reliability, and smoother integration with existing applications and infrastructure. These technical reference architectures optimize storage, networking, and servers with the Cloudera and Hortonworks distributions of Hadoop. Leverage FlexPod Select with Hadoop to help store, manage, process, and perform advanced analytics on your multi-structured data. Tuning parameters and optimization techniques, among other Hadoop cluster guidelines, are provided.
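The validated tuning values themselves are not listed in this abstract; as a generic illustration of the kind of HDFS parameters such cluster guidelines cover (the values below are placeholders, not the FlexPod-validated settings):

```xml
<!-- hdfs-site.xml fragment: illustrative values only -->
<configuration>
  <property>
    <name>dfs.blocksize</name>
    <value>268435456</value> <!-- 256 MiB blocks favor large sequential scans -->
  </property>
  <property>
    <name>dfs.replication</name>
    <value>3</value> <!-- Hadoop's default replication factor -->
  </property>
</configuration>
```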
Tags : flexpod with hadoop, enterprise data, storage infrastructure, massive amounts of data
     NetApp
To get your white papers featured in the insideHPC White Paper Library, contact: Kevin@insideHPC.com