data capacity

Results 1 - 25 of 194
By: IBM     Published Date: Sep 16, 2015
Are you trying to support more variable workloads than your environment can handle? Could you benefit from a high performance cluster, but lack the budget or resources to deploy and manage technical computing infrastructure? Are you running out of data center space but still need to grow your compute capacity? If you want answers to any of these questions, join us for an informative webinar describing the advantages and pitfalls of relocating a high performance workload to the cloud. View this webinar to learn:
- Why general purpose clouds are insufficient for technical computing, analytics, and Hadoop workloads
- How high performance clouds can improve your profitability and give you a competitive edge
- How to ensure that your cloud environment is secure
- How to evaluate which applications are suitable for a hybrid or public cloud environment
- How to get started and choose a service provider
Tags : 
     IBM
By: Seagate     Published Date: Sep 30, 2015
Although high-performance computing (HPC) often stands apart from a typical IT infrastructure—it uses highly specialized scale-out compute, networking and storage resources—it shares with mainstream IT the ability to push data center capacity to the breaking point. Much of this is due to data center inefficiencies caused by HPC storage growth. The Seagate® ClusterStor™ approach to scale-out HPC storage can significantly improve data center efficiency. No other vendor solution offers the same advantages.
Tags : 
     Seagate
By: Panasas     Published Date: Oct 02, 2014
HPC and technical computing environments require the collection, storage, and transmission of large-scale datasets. To meet these demands, data center architects must consider how increasing storage capacity over time will affect HPC workloads, performance, and system availability. While many enterprises have looked to scale-up NAS to meet their storage needs, this approach can lead to data islands that make it difficult to share data. Distributed, scale-out storage was developed to get around the technology limitations of scale-up NAS architectures.
Tags : 
     Panasas
By: Infinidat EMEA     Published Date: May 14, 2019
Digital transformation is a business enabler, one that also translates to an increase in the demand for greater storage capacity and performance. This increasing demand requires IT organizations to re-examine their data storage strategy as the growth in capacity doesn’t align with a shrinking IT budget. To support the growth of the business and remain competitive in a global digital market, CIOs are asked more than ever to “do more with less,” while improving performance and availability at the same time.
Tags : 
     Infinidat EMEA
By: Infinidat EMEA     Published Date: May 14, 2019
Infinidat® enterprise storage solutions are based upon the unique and patented Infinidat storage architecture—a fully abstracted set of Software-Defined Storage (SDS) functions integrated with best-of-breed, off-the-shelf commodity hardware. Infinidat’s software-focused architecture, an evolution and revolution in data management design over 30 years in the making, solves the conflicting requirements of bigger, faster, and less expensive. This paper discusses the technology that enables Infinidat to be the only enterprise storage provider that achieves multi-petabyte capacity with faster than all-flash performance (over 1.3M IOPS at microsecond latency) and an unprecedented seven-nines availability, all at the lowest Total Cost of Ownership (TCO).
Tags : 
     Infinidat EMEA
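As a quick aside on the seven-nines claim in the Infinidat abstract above, the following is a minimal sketch using standard availability arithmetic (not an Infinidat-specific method) to translate nines of availability into expected annual downtime.

```python
# Standard availability arithmetic: what "seven-nines availability" means
# in expected downtime per year. Not an Infinidat-specific calculation.

SECONDS_PER_YEAR = 365.25 * 24 * 3600  # ~31.6 million seconds

def downtime_seconds_per_year(nines: int) -> float:
    """Expected annual downtime for an availability of N nines (e.g. 7 -> 99.99999%)."""
    unavailability = 10 ** (-nines)
    return SECONDS_PER_YEAR * unavailability

# Seven nines (99.99999%) allows roughly 3 seconds of downtime per year,
# versus about 5 minutes per year for the more common five nines.
print(round(downtime_seconds_per_year(7), 2))       # ~3.16 seconds
print(round(downtime_seconds_per_year(5) / 60, 1))  # ~5.3 minutes
```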
By: NetApp     Published Date: Dec 18, 2013
IT managers have indicated that their two most significant challenges in managing unstructured data at multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits are:
- Continuous access to file data while maintaining data redundancy, with no administrator intervention needed
- Easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage
- Secure multi-tenancy using security partitions
- Effectively infinite, on-demand capacity with fast access to files and objects in the cloud
- Secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on
Tags : 
     NetApp
By: Upsite Technologies     Published Date: Sep 18, 2013
The average computer room today has cooling capacity that is nearly four times the IT heat load. Using data from 45 sites reviewed by Upsite Technologies, this white paper will show how you can calculate, benchmark, interpret, and benefit from a simple and practical metric called the Cooling Capacity Factor (CCF). Calculating the CCF is the quickest and easiest way to determine cooling infrastructure utilization and potential gains to be realized by AFM improvements.
Tags : ccf, upsite technologies, cooling capacity factor, energy costs, cooling, metrics, practical, benchmark
     Upsite Technologies
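To make the CCF metric described above concrete, here is a minimal sketch assuming the simplest form of the ratio: running rated cooling capacity divided by the IT heat load. Upsite's white paper defines the exact inputs and adjustments, which may differ from this illustration.

```python
# Hypothetical illustration of a cooling-capacity ratio like the CCF.
# Assumes CCF = total running rated cooling capacity / IT heat load;
# the white paper's exact definition and adjustments may differ.

def cooling_capacity_factor(running_cooling_kw: float, it_heat_load_kw: float) -> float:
    """Ratio of installed running cooling capacity to the IT heat load."""
    if it_heat_load_kw <= 0:
        raise ValueError("IT heat load must be positive")
    return running_cooling_kw / it_heat_load_kw

# Example: a room with 800 kW of running cooling and a 200 kW IT load
# has a CCF of 4.0 -- the "nearly four times" figure cited above.
print(cooling_capacity_factor(800.0, 200.0))  # 4.0
```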
By: Spectrum Enterprise     Published Date: Oct 29, 2018
Bandwidth. Speed. Throughput. These terms are not interchangeable. They are interrelated concepts in data networking that help measure capacity, the time it takes to get from one point to the next, and the actual amount of data you’re receiving, respectively. When you buy an Internet connection from Spectrum Enterprise, you’re buying a pipe between your office and the Internet with a set capacity, whether it is 25 Mbps, 10 Gbps, or any increment in between. However, the bandwidth we provide does not tell the whole story; it is the throughput of the entire system that matters. Throughput is affected by obstacles, overhead, and latency, meaning the throughput of the system will never equal the bandwidth of your Internet connection. The good news is that an Internet connection from Spectrum Enterprise is engineered to ensure you receive the capacity you purchase; we proactively monitor your bandwidth to ensure problems are dealt with promptly, and we are your advocates across the Internet.
Tags : 
     Spectrum Enterprise
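As a rough companion to the bandwidth-versus-throughput distinction above, the sketch below estimates single-stream throughput from two commonly cited limits: protocol overhead and the TCP window/latency bound. The overhead fraction, window size, and round-trip time are illustrative assumptions, not Spectrum Enterprise figures.

```python
# A minimal sketch of why measured throughput falls short of purchased
# bandwidth: protocol overhead and latency both take a cut.
# All figures below are illustrative assumptions.

def effective_throughput_mbps(bandwidth_mbps: float,
                              overhead_fraction: float,
                              tcp_window_bytes: int,
                              rtt_seconds: float) -> float:
    """Upper bound on single-stream throughput, in Mbps."""
    # Bandwidth left after protocol overhead (headers, acknowledgements, etc.)
    after_overhead = bandwidth_mbps * (1.0 - overhead_fraction)
    # Classic latency limit for one TCP stream: window size / round-trip time
    latency_limit = (tcp_window_bytes * 8) / rtt_seconds / 1_000_000
    return min(after_overhead, latency_limit)

# 100 Mbps circuit, ~5% overhead, 64 KB TCP window, 30 ms round trip:
print(effective_throughput_mbps(100.0, 0.05, 64 * 1024, 0.030))  # ~17.5 Mbps
```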
By: Hitachi Vantara     Published Date: Mar 08, 2019
Today, data center managers are being asked to do more than ever before: Bring on more applications at a faster pace, add more capacity to existing applications and deliver more performance.
Tags : 
     Hitachi Vantara
By: Hewlett Packard Enterprise     Published Date: May 11, 2018
Very little data is available on how effectively enterprises are managing private cloud deployments in the real world. Are they doing so efficiently, or are they facing challenges in areas such as performance, TCO and capacity? Hewlett Packard Enterprise commissioned 451 Research to explore these issues through a survey of IT decision-makers and data from the Cloud Price Index.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: May 11, 2018
If your business is like most, you are grappling with data storage. In an annual Frost & Sullivan survey of IT decision-makers, storage growth has been listed among top data center challenges for the past five years.2 With businesses collecting, replicating, and storing exponentially more data than ever before, simply acquiring sufficient storage capacity is a problem. Even more challenging is that businesses expect more from their stored data. Data is now recognized as a precious corporate asset and competitive differentiator: spawning new business models, new revenue streams, greater intelligence, streamlined operations, and lower costs. Booming market trends such as Internet of Things and Big Data analytics are generating new opportunities faster than IT organizations can prepare for them.
Tags : 
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: May 10, 2019
Nimble Secondary Flash array represents a new type of data storage, designed to maximize both capacity and performance. By adding high-performance flash storage to a capacity-optimized architecture, it provides a unique backup platform that lets you put your backup data to work. Nimble Secondary Flash array uses flash performance to provide both near-instant backup and recovery from any primary storage system. It is a single device for backup, disaster recovery, and even local archiving. By using flash, you can accomplish real work such as dev/test, QA, and analytics. Deep integration with Veeam’s leading backup software simplifies data lifecycle management and provides a path to cloud archiving.
Tags : 
     Hewlett Packard Enterprise
By: Spectrum Enterprise     Published Date: Mar 01, 2019
Bandwidth. Speed. Throughput. These terms are not interchangeable. They are interrelated concepts in data networking that help measure capacity, the time it takes to get from one point to the next and the actual amount of data you’re receiving, respectively.
Tags : 
     Spectrum Enterprise
By: Dell EMC     Published Date: Nov 08, 2016
Time-to-market, consolidation, and complexity struggles are a thing of the past. Join your peers in database storage nirvana with the Dell EMC All-Flash portfolio, powered by Intel® Xeon® processors.
Tags : database, consolidation, capacity, storage, complexity
     Dell EMC
By: Oracle CX     Published Date: Oct 19, 2017
The Software in Silicon design of the SPARC M7 processor, and the recently announced SPARC S7 processor, implements memory access validation directly in the processor so that you can protect application data that resides in memory. It also includes on-chip Data Analytics Accelerator (DAX) engines that are specifically designed to accelerate analytic functions. The DAX engines make in-memory databases and applications run much faster, and they significantly increase usable memory capacity by allowing compressed databases to be stored in memory without a performance penalty. These Software in Silicon technologies are implemented in both the SPARC S7 and M7 processors; Security in Silicon encompasses both Silicon Secured Memory and cryptographic instruction acceleration, while SQL in Silicon includes In-Memory Query Acceleration and In-Line Decompression. Silicon Secured Memory is the first-ever end-to-end implementation of memory-access validation done in hardware.
Tags : 
     Oracle CX
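The in-line decompression point above boils down to simple arithmetic: if compressed data can be queried in place, the logical data set that fits in memory grows by the compression ratio. The sketch below is illustrative only; the 3:1 ratio is a hypothetical assumption, not a measured SPARC M7/S7 result.

```python
# Illustrative sketch (not Oracle figures): how querying compressed data in
# place can raise the usable in-memory capacity of a database. The compression
# ratio below is a hypothetical assumption.

def usable_in_memory_capacity_gb(physical_memory_gb: float,
                                 compression_ratio: float) -> float:
    """Logical data volume that fits in memory when stored compressed."""
    return physical_memory_gb * compression_ratio

# With 1 TB of DRAM and an assumed 3:1 columnar compression ratio,
# roughly 3 TB of logical table data could be held in memory.
print(usable_in_memory_capacity_gb(1024, 3.0))  # 3072.0
```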
By: Dell and Nutanix     Published Date: Jan 16, 2018
Because many SQL Server implementations are running on virtual machines already, the use of a hyperconverged appliance is a logical choice. The Dell EMC XC Series with Nutanix software delivers high performance and low Opex for both OLTP and analytical database applications. For those moving from SQL Server 2005 to SQL Server 2016, this hyperconverged solution provides particularly significant benefits.
Tags : data, security, add capacity, infrastructure, networking, virtualization, dell
     Dell and Nutanix
By: Digital Realty     Published Date: Feb 25, 2015
When measuring competitive differentiation in milliseconds, connectivity is a key component for any financial services company’s data center strategy. In planning the move of its primary data center, a large international futures and commodities trading company needed to find a provider that could deliver the high capacity connectivity it required.
Tags : financial services, trade processing, data center, connectivity, data center, it management, data management
     Digital Realty
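For context on why milliseconds drive data center placement decisions like the one described above, the sketch below estimates the best-case round-trip delay imposed by fiber distance alone; the refractive index and route length are generic assumptions, not figures from the Digital Realty case study.

```python
# General illustration: fiber distance alone sets a floor on latency, which is
# why data center location and connectivity matter when milliseconds decide
# competitiveness. Values are generic physics, not case-study data.

SPEED_OF_LIGHT_KM_S = 299_792                         # in vacuum
FIBER_PROPAGATION_KM_S = SPEED_OF_LIGHT_KM_S / 1.47   # typical fiber refractive index ~1.47

def round_trip_latency_ms(fiber_route_km: float) -> float:
    """Best-case round-trip propagation delay over a fiber route, ignoring equipment delays."""
    one_way_s = fiber_route_km / FIBER_PROPAGATION_KM_S
    return 2 * one_way_s * 1000

# A 1,000 km fiber route adds roughly 10 ms of round-trip delay before any
# switching, routing, or processing overhead is counted.
print(round(round_trip_latency_ms(1000), 1))  # ~9.8
```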
By: Dell EMC     Published Date: May 12, 2016
Businesses face greater uncertainty than ever. Market conditions, customer desires, competitive landscapes, and regulatory constraints change by the minute. So business success is increasingly contingent on predictive intelligence and hyperagile responsiveness to relentlessly evolving demands. This uncertainty has significant implications for the data center — especially as business becomes pervasively digital. IT has to support business agility by being more agile itself. It has to be able to add services, scale capacity up and down as needed, and nimbly remap itself to changes in organizational structure.
Tags : 
     Dell EMC
By: Dell Brought to you by Intel     Published Date: Dec 09, 2013
Database performance and memory capacity with the Intel Xeon processor E5-2660 v2-powered Dell PowerEdge M620.
Tags : dell, xeon processors e5-2660, database performance, intel xeon processor, poweredge m620., software development, it management
     Dell Brought to you by Intel
By: Dell EMC     Published Date: Aug 17, 2017
For many companies the appeal of the public cloud is very real. For tech startups, the cloud may be their only option, since many don’t have the capital or expertise to build and operate the IT systems their businesses need. Existing companies with established data centers are also looking at public clouds to increase IT agility while limiting risk. The idea of building out their production capacity while possibly reducing the costs attached to that infrastructure can be attractive. For most companies the cloud isn’t an “either-or” decision, but an operating model to be evaluated along with on-site infrastructure. And like most infrastructure decisions, the question of cost is certainly a consideration. In this report we’ll explore that question, comparing the cost of an on-site hyperconverged solution with a comparable setup in the cloud. The on-site infrastructure is a Dell EMC VxRail™ hyperconverged appliance cluster and the cloud solution is Amazon Web Services (AWS).
Tags : public cloud, it systems, data center, it agility, hyperconverged solution, hyperconverged appliance
     Dell EMC
By: Butler Technologies     Published Date: Jul 03, 2018
MPO connectors increase your data capacity with a highly efficient use of space. But users have faced challenges such as extra complexities and time required for testing and troubleshooting multi-fiber networks. VIAVI helps overcome these challenges with the industry's most complete portfolio of test solutions for MPO connectivity.
Tags : 
     Butler Technologies
By: Dell Storage     Published Date: Apr 17, 2012
A scale-out storage architecture helps organizations deal with demands for growing data capacity and access. Dell engineers put Dell EqualLogic scale-out storage through its paces to demonstrate its scalability in both file and block I/O scenarios.
Tags : storage
     Dell Storage
By: HP     Published Date: Jan 18, 2013
Today enterprises are more dependent on robust, agile IT solutions than ever before. It’s not just about technology—people and processes need to make the cloud journey too, and to realize the benefit of new technology, new support is needed.
Tags : data center, hp data center care, flexible capacity service, service, capacity, flexible, it management, data management
     HP
By: PernixData     Published Date: Jun 01, 2015
Storage arrays are struggling to keep up with virtualized data centers. The traditional solution of buying more capacity to get more performance is an expensive answer – with inconsistent results. A new approach is required to more cost effectively provide the storage performance you need, when and where you need it most.
Tags : pernixdata, sql, database, servers, architecture, data management
     PernixData