storage flexibility

Results 1 - 25 of 72 | Sort Results By: Published Date | Title | Company Name
By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A data lake allows an organization to store all of its data, structured and unstructured, in one centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema, and you no longer need to know beforehand what questions you want to ask of your data. Download to find out more now.
Tags : 
     Amazon Web Services
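The schema-on-read idea described in the abstract above can be sketched in a few lines: heterogeneous raw records are stored exactly as they arrive, and a schema is applied only when a question is finally asked. This is a minimal local illustration of the concept, not AWS-specific code; the record fields and the `query_clicks` helper are invented for the example.

```python
import json

# Raw, heterogeneous records land in the "lake" exactly as they arrive --
# no predefined schema is enforced at write time.
lake = [
    json.dumps({"user": "alice", "clicks": 3}),        # app event
    json.dumps({"sensor": "t1", "temp_c": 21.5}),      # IoT reading
    '{"user": "bob", "clicks": 7, "region": "eu"}',    # extra fields are fine
]

def query_clicks(raw_records):
    """Apply a schema at read time: keep only records with a 'user' field."""
    rows = []
    for rec in raw_records:
        obj = json.loads(rec)
        if "user" in obj:                  # schema applied on read, not write
            rows.append((obj["user"], obj.get("clicks", 0)))
    return rows

print(query_clicks(lake))   # [('alice', 3), ('bob', 7)]
```

The same store-first, interpret-later pattern is what S3-based data lakes enable at scale: the question ("clicks per user") did not have to exist when the records were written.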
By: Amazon Web Services     Published Date: Jul 25, 2018
Organizations are collecting and analyzing increasing amounts of data, making it difficult for traditional on-premises solutions for data storage, data management, and analytics to keep pace. Amazon S3 and Amazon Glacier provide an ideal storage solution for data lakes. They provide options such as broad and deep integration with traditional big data analytics tools, as well as innovative query-in-place analytics tools that help you eliminate costly and complex extract, transform, and load (ETL) processes. This guide explains each of these options and provides best practices for building your Amazon S3-based data lake.
Tags : 
     Amazon Web Services
By: Datto Inc.     Published Date: Jan 31, 2013
Business continuity, data insurance, local storage flexibility, cost benefits, standards compliance, and infrastructure control: this is just some of what Datto and hybrid cloud backup can do for you and your clients.
Tags : backup, data recovery, business continuity, channel, hybrid-cloud, bdr, virtualization, small to medium-sized business, smb, downtime, instant virtualization, disaster recovery, managed-service provider, msp, recovery time objective, rto, recovery point objective, rpo, risk assessment, datto
     Datto Inc.
By: VMware     Published Date: Dec 10, 2018
As agencies continue to modernize data center infrastructure to meet evolving mission needs and technologies, they are turning to agile software and cloud solutions. One such solution is hyper-converged infrastructure (HCI), a melding of virtual compute, storage, and networking capabilities supported by commodity hardware. With data and applications growing exponentially along with the need for more storage capacity and flexibility, HCI helps offset the rising demands placed on government IT infrastructure. HCI also provides a foundation for hybrid cloud, helping agencies permanently move applications and workloads into public cloud and away from the data center.
Tags : 
     VMware
By: Red Hat     Published Date: Nov 30, 2015
A step-by-step guide for adopting software-defined storage as part of a practical, proactive strategy for managing enterprise data. Flexibility is inherent in this technology, and now more than ever, a flexible IT architecture might be the key to facing the once-unthinkable challenges of big data.
Tags : software-defined storage, enterprise data, it architecture, big data, enterprise applications
     Red Hat
By: NetApp APAC     Published Date: Jul 08, 2019
All-flash storage is on a strong growth trajectory, but the industry is not swooning naively over the blazing-fast I/O speeds. Most IT managers are taking a mature, step-by-step approach to all-flash adoption. Falling per-gigabyte prices are making the technology more commonplace, and solid-state drives are no longer just for specialized, high-performance tasks. Before they leap, however, storage professionals want to understand the full business picture as they formulate a winning strategy for putting all-flash storage to use for more workloads. This paper offers insights for business success with all-flash storage based on IT Central Station reviews. Real users weigh in on what it takes to get the most out of the technology. It covers such aspects of flash storage as the need for simplicity and the importance of flexibility. The paper also looks at how to build a business case for all-flash and think through the implications of issues such as integration with existing infrastructure.
Tags : 
     NetApp APAC
By: PC World     Published Date: Jul 02, 2012
Eight Must-Have Technologies for the IT Director
Tags : dell, pc world, storage virtualization, thin provisioning, technologies
     PC World
By: NetApp     Published Date: Dec 09, 2014
Although the cost of flash storage solutions continues to fall, on a per-gigabyte capacity basis, it is still significantly more expensive to acquire than traditional hard drives. However, when the cost per gigabyte is examined in terms of TCO, and the customer looks past the pure acquisition cost and accounts for “soft factors” such as prolonging the life of a data center, lower operating costs (for example, power and cooling), increased flexibility and scalability, or the service levels that a flash solution enables, flash solution costs become increasingly competitive with spinning media.
Tags : flash storage, netapp, tco, flash storage solutions, soft factors, flash solution, it management
     NetApp
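The TCO argument in the abstract above can be made concrete with some back-of-the-envelope arithmetic: once annual operating costs (power, cooling, floor space) over the device's service life are added to the pure acquisition price, the per-gigabyte gap narrows. All figures below are invented for illustration only, not vendor pricing.

```python
def tco_per_gb(acquisition_per_gb, annual_opex_per_gb, years):
    """Total cost of ownership per GB over the device's service life."""
    return acquisition_per_gb + annual_opex_per_gb * years

# Hypothetical numbers: flash costs more to acquire but far less to
# power and cool over a five-year lifespan.
hdd = tco_per_gb(acquisition_per_gb=0.03, annual_opex_per_gb=0.02, years=5)
flash = tco_per_gb(acquisition_per_gb=0.10, annual_opex_per_gb=0.004, years=5)

print(f"HDD 5-year TCO/GB:   ${hdd:.3f}")    # $0.130
print(f"Flash 5-year TCO/GB: ${flash:.3f}")  # $0.120
```

With these illustrative inputs the flash solution ends up cheaper per gigabyte over five years despite a 3x higher acquisition cost, which is the "soft factors" point the paper is making.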
By: NetApp     Published Date: Aug 14, 2015
Choosing a cloud storage consumption model.
Tags : cloud, storage, data, strategy, applications, flexibility
     NetApp
By: Amazon Web Services     Published Date: Oct 09, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A data lake allows an organization to store all of its data, structured and unstructured, in one centralized repository.
Tags : cost effective, data storage, data collection, security, compliance, platform, big data, it resources
     Amazon Web Services
By: Oracle     Published Date: Jan 28, 2019
Traditionally, the best practice for mission-critical Oracle Database backup and recovery was to use storage-led, purpose-built backup appliances (PBBAs) such as Data Domain, integrated with RMAN, Oracle’s automated backup and recovery utility. This disk-based backup approach solved two problems: 1) it enabled faster recovery (from disk versus tape), and 2) it increased recovery flexibility by keeping many more backups online, both for restoring production databases and for provisioning copies for test/dev. At its core, however, this approach remains a batch process that involves many dozens of complicated steps for backups and even more steps for recovery. Oracle’s Zero Data Loss Recovery Appliance (RA) customers report that total cost of ownership (TCO) and downtime costs (e.g., lost revenue due to database or application downtime) are significantly reduced due to the simplification and, where possible, the automation of the backup and recovery process.
Tags : 
     Oracle
By: Oracle     Published Date: Jan 28, 2019
For more than a decade, Oracle has developed and enhanced its ZFS Storage Appliance, giving its users a formidable unified, enterprise-grade storage offering. The latest release, ZS7-2, boasts upgraded hardware and software and is a timely reminder that more users might do well to evaluate this offering. It has a trifecta of advantages: (1) its notable performance, price-performance, and flexibility are all improved in this new release; (2) there is a surprisingly inclusive set of functionalities, including excellent storage analytics that were developed even before analytics became a contemporary “must-have”; (3) there’s a compelling group of “better together” elements that make the ZFS Storage Appliance a particularly attractive choice for both Oracle Database environments and users that want to seamlessly integrate a cloud component into their IT infrastructure. Given the proven abilities of Oracle’s prior models, it’s also safe to assume that the new ZS7-2 will outperform other m
Tags : 
     Oracle
By: Red Hat     Published Date: Mar 14, 2016
The journey toward software-defined storage is different for each IT organization. When choosing a storage solution, it’s important to consider flexibility, availability, and agility.
Tags : 
     Red Hat
By: NetApp     Published Date: Sep 11, 2014
Although the cost of flash storage solutions continues to fall, on a per-gigabyte capacity basis, it is still significantly more expensive to acquire than traditional hard drives. However, when the cost per gigabyte is examined in terms of TCO, and the customer looks past the pure acquisition cost and accounts for “soft factors” such as prolonging the life of a data center, lower operating costs (for example, power and cooling), increased flexibility and scalability, or the service levels that a flash solution enables, flash solution costs become increasingly competitive with spinning media.
Tags : flash technologies, total cost of ownership, flash storage, data center, increased flexibility
     NetApp
By: Druva     Published Date: Sep 27, 2017
Virtualization tools are providing storage pros with the ability to overcome the limitations of physical servers and gain more flexibility. While hypervisors are allowing them to leap over these boundaries, they’re simultaneously crumbling traditional data protection walls. It’s time to rethink how we handle backup. Inside, discover 8 essential real-world practices to put in place for hypervisor backup, archiving, and disaster recovery.
Tags : vmware, virtualization, hypervisors, data protection, archiving, disaster recovery
     Druva
By: IBM APAC     Published Date: Mar 19, 2018
We know that flash storage can provide significant benefits and deliver needed agility for large businesses, but what about medium-sized businesses (MMBs, 500 to 999 employees)? IBM asked Forrester Consulting to look into adoption and use of flash storage technologies by mid-market organizations. A snapshot survey of 106 IT professionals found that MMBs are actively adopting flash storage; that flash is exceeding expectations regarding cloud management, flexibility, and security; and that a majority of respondents expect to continue their investments, with almost half aspiring to an all-flash storage environment. To learn more, download this report.
Tags : 
     IBM APAC
By: Sponsored by HP and Intel® Xeon® processors     Published Date: May 07, 2012
The Nemertes Research PilotHouse Awards provide insight on the performance of technology vendors. See which vendors were recognized for their servers built for virtualization.
Tags : virtualization, servers, intel, storage, 2.0, infrastructure, trends, storage, flexibility, drivers, market dynamics, trends, converged, custom requirement, deduplication, tiering, protocol, foundation, data management, data center
     Sponsored by HP and Intel® Xeon® processors
By: Red Hat     Published Date: Jul 16, 2012
An introduction to Red Hat Storage Server Architecture
Tags : storage server, red hat, scalability, flexibility, affordability
     Red Hat
By: IBM MidMarket Demand Generation     Published Date: Apr 23, 2012
With real storage virtualization, according to the report, you can improve operational efficiency and drive down costs. And you can do this without sacrificing flexibility (or while even improving it), giving you the ability to “continuously migrate” applications as part of normal business operations.
Tags : ibm, technology, storage, virtualization, business technology, it management, data storage
     IBM MidMarket Demand Generation
By: Sponsored by HP and Intel® Xeon® processors     Published Date: May 07, 2012
Discover how the shift from "buy and hold" data center strategy to the next generation of HP ProLiant servers can tackle today's IT constraints and tomorrow's business opportunities in this new IDC dynamic white paper.
Tags : intel, storage, 2.0, infrastructure, trends, storage, flexibility, drivers, market dynamics, trends, converged, custom requirement, deduplication, tiering, protocol, foundation, data management, data center
     Sponsored by HP and Intel® Xeon® processors
By: Red Hat     Published Date: Jun 09, 2014
Read this whitepaper to find out how Red Hat Storage Server can allow enterprises to quickly and confidently deliver business applications that minimize cost, complexity, and risk while increasing architectural flexibility.
Tags : red hat, data explosion, business applications, enterprise architects, mobile, social, data platforms, digital media, it management, data management
     Red Hat
By: Red Hat     Published Date: Jun 10, 2014
Read this whitepaper to find out how Red Hat Storage Server can allow enterprises to quickly and confidently deliver business applications that minimize cost, complexity, and risk while increasing architectural flexibility.
Tags : red hat, data explosion, business applications, enterprise architects, mobile, social, data platforms, digital media, it management, data management
     Red Hat
By: AWS     Published Date: Nov 02, 2017
Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. A data lake is an architectural approach that allows you to store massive amounts of data in a central location, so it’s readily available to be categorized, processed, analyzed, and consumed by diverse groups within an organization. Since data, structured and unstructured, can be stored as-is, there’s no need to convert it to a predefined schema, and you no longer need to know beforehand what questions you want to ask of your data.
Tags : 
     AWS
By: HP     Published Date: Aug 03, 2009
Snapfish is the world's No. 1 online photo service, with more than 60 million members in 20 countries and more than 5 billion unique photos stored online. Its previous storage infrastructure was not able to meet Snapfish's existing and future needs. The company was looking for a storage solution that would offer extreme scalability, flexibility, and reliability; improve performance; balance performance and capacity with affordability; and meet its business needs for partnering. Read this case study to learn how Snapfish built and has maintained a highly scalable, affordable, high-performance storage architecture that has kept up with customer demand.
Tags : snapfish, nas, storage, bladesystem, scalability, san
     HP
By: Hewlett-Packard     Published Date: May 13, 2008
Learn how SARCOM, a leading IT services provider, used a virtualized HP-based SAN infrastructure linked to server blades, with point-in-time replication (full data set snapclones) and backup to tape libraries to improve storage performance, flexibility, and reliability while reducing costs and protecting data.
Tags : hp data protection, data protection, sarcom, virtualization, storage virtualization, virtual storage, virtual array, eva, storage
     Hewlett-Packard
Get your white papers featured in the insideHPC White Paper Library contact: Kevin@insideHPC.com