data system

Results 1 - 25 of 1221
By: Cray     Published Date: Jul 02, 2015
As global energy costs climb, Cray has taken its long-standing expertise in optimizing power and cooling and focused it on developing overall system energy efficiency. The resulting Cray XC supercomputer series integrates into modern datacenters and achieves high levels of efficiency while minimizing system and infrastructure costs.
Tags : 
     Cray
By: Green Revolution Cooling     Published Date: May 12, 2014
Download Green Revolution Cooling’s white paper, “Data Center Floor Space Utilization – Comparing Density in Liquid Submersion and Air Cooling Systems,” to learn about the density of liquid submersion cooling, why looks can be deceiving, and how, more often than not, liquid cooling once again beats air.
Tags : green revolution, data center floor space
     Green Revolution Cooling
By: Intel     Published Date: Aug 06, 2014
Designing a large-scale, high-performance data storage system presents significant challenges. This paper describes a step-by-step approach to designing such a system and presents an iterative methodology that applies at both the component level and the system level. A detailed case study applies this methodology to the design of a Lustre storage system.
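To give a flavor of the component-to-system iteration the paper describes, here is a minimal sizing sketch in Python. All component figures are illustrative assumptions, not numbers from the paper:

import math

# Hypothetical targets and component specs (assumptions for this sketch).
target_bw_gbs = 100.0        # desired aggregate write bandwidth, GB/s
drive_bw_gbs = 0.15          # sustained bandwidth of one drive, GB/s
drives_per_server = 60       # drives behind one object storage server
server_net_gbs = 5.0         # network ceiling of one server, GB/s

# Component level: a server delivers the lesser of its disk and network limits.
server_bw_gbs = min(drives_per_server * drive_bw_gbs, server_net_gbs)

# System level: servers needed to reach the aggregate target.
servers = math.ceil(target_bw_gbs / server_bw_gbs)
print(f"per-server bandwidth: {server_bw_gbs:.1f} GB/s")
print(f"servers required for {target_bw_gbs:.0f} GB/s: {servers}")

In a full design pass this calculation would be iterated, feeding measured component results back into the system-level targets.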
Tags : intel, high performance storage
     Intel
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software

The Intel® portfolio for high-performance computing provides the following technology solutions:

• Compute: The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.

• Storage: High-performance, highly scalable storage solutions with Intel® Enterprise Edition for Lustre* software and Intel® Xeon processor E7 based storage systems for centralized storage, plus reliable and responsive local storage with Intel® Solid State Drives.

• Networking: Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.

• Software and tools: A broad range of software and tools to optimize and parallelize your software and clusters.
Tags : 
     Intel
By: IBM     Published Date: Jun 05, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : ibm
     IBM
By: IBM     Published Date: Sep 02, 2014
This two-year research initiative, conducted in collaboration with IBM, focuses on key trends, best practices, challenges, and priorities in enterprise risk management, covering topics as diverse as culture, organizational structure, data, systems, and processes.
Tags : ibm, chartis, risk enabled enterprise
     IBM
By: IBM     Published Date: Sep 16, 2015
The IBM Spectrum Scale solution delivered up to 11x better throughput than EMC Isilon for Spectrum Protect (TSM) workloads. Using published data, Edison compared a solution built on EMC® Isilon® against an IBM® Spectrum Scale™ solution. (IBM Spectrum Scale was formerly IBM® General Parallel File System™, or IBM® GPFS™, also known by the code name Elastic Storage.) For both solutions, IBM® Spectrum Protect™ (formerly IBM Tivoli® Storage Manager, or IBM® TSM®) was used as a common workload performing backups to the target storage systems evaluated.
Tags : 
     IBM
By: SGI     Published Date: Mar 03, 2015
The SGI UV system is uniquely suited to bioinformatics and genomics, providing the computational capabilities and global shared-memory architecture needed for even the most demanding sequencing and analytic tasks, including post-sequencing and other data-intensive workflows. Because of the system’s outstanding speed and throughput, genomics researchers can complete very large jobs in less time, realizing a dramatically accelerated time-to-solution. Best of all, they can explore avenues of research that were computationally beyond the reach of HPC systems lacking the power and in-memory capabilities of the SGI UV.
Tags : 
     SGI
By: IBM     Published Date: Nov 14, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : 
     IBM
By: Panasas     Published Date: Oct 02, 2014
HPC and technical computing environments require the collection, storage, and transmission of large-scale datasets. To meet these demands, data center architects must consider how increasing storage capacity over time will affect HPC workloads, performance, and system availability. While many enterprises have looked to scale-up NAS to meet their storage needs, this approach can lead to data islands that make it difficult to share data. Distributed, scale-out storage was developed to get around the technology limitations of scale-up NAS architectures.
Tags : 
     Panasas
By: HP     Published Date: Oct 08, 2015
Administrators, engineers, and executives are now tasked with solving some of the world’s most complex challenges, from advanced computations for science and business to education, pharmaceuticals, and beyond. Here’s the challenge: many data centers are reaching peak levels of resource consumption, and there’s more work to be done. So how are engineers and scientists supposed to keep working around such high-demand applications? How can they continue to produce ground-breaking research on an already optimized infrastructure? How can a platform scale to the new needs and demands of these users and applications? This is where HP Apollo Systems help reinvent the modern data center and accelerate your business.
Tags : apollo systems, reinventing hpc and the supercomputer, reinventing modern data center
     HP
By: HPE     Published Date: Jul 21, 2016
Deep learning is a method of creating artificial intelligence systems that combine computer-based multi-layer neural networks with intensive training techniques and large data sets to enable analysis and predictive decision making. A fundamental aspect of deep learning environments is that they transcend finite programmable constraints to the realm of extensible and trainable systems. Recent developments in technology and algorithms have enabled deep learning systems not only to equal but to exceed human capabilities in the pace of processing vast amounts of information.
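As a toy illustration of the idea above, a multi-layer network refined by a training loop over data, here is a minimal NumPy sketch. The architecture, data, and hyperparameters are invented for the example and are not from the HPE paper:

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(256, 4))                  # synthetic inputs
y = (X.sum(axis=1, keepdims=True) > 0) * 1.0   # synthetic labels

# Two-layer network: 4 inputs -> 8 hidden units -> 1 output probability.
W1 = rng.normal(scale=0.5, size=(4, 8)); b1 = np.zeros((1, 8))
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros((1, 1))

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for step in range(2000):
    h = np.tanh(X @ W1 + b1)      # hidden layer
    p = sigmoid(h @ W2 + b2)      # predicted probability
    dp = (p - y) / len(X)         # gradient of cross-entropy wrt logits
    dW2 = h.T @ dp; db2 = dp.sum(0, keepdims=True)
    dh = dp @ W2.T * (1 - h**2)   # backprop through tanh
    dW1 = X.T @ dh; db1 = dh.sum(0, keepdims=True)
    for param, grad in ((W1, dW1), (b1, db1), (W2, dW2), (b2, db2)):
        param -= 0.5 * grad       # plain gradient descent
print("training accuracy:", ((p > 0.5) == y).mean())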
Tags : 
     HPE
By: BitStew     Published Date: Jun 23, 2016
If we examine the root cause of the IT organization’s pain, it continues to be the inability to integrate data from many disparate, operationally focused systems. A recent survey found that only 18% of respondents had extensively integrated their analytics initiatives across operations and realized their objectives, a shortfall attributed to the diversity of datasets and formats and to poor data quality. Indeed, data integration places the analytics burden squarely on IT professionals. Download the white paper to learn more.
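To make the “disparate systems and formats” burden concrete, here is a minimal normalization sketch; the two source formats and all field names are hypothetical:

from datetime import datetime, timezone

def from_scada(rec):
    # A SCADA feed reporting epoch seconds, a raw tag name, and string values.
    return {
        "asset_id": rec["tag"].upper(),
        "timestamp": datetime.fromtimestamp(rec["ts"], tz=timezone.utc),
        "value": float(rec["val"]),
    }

def from_billing(rec):
    # A billing system reporting ISO-8601 timestamps and numeric kWh readings.
    return {
        "asset_id": rec["meter_id"].upper(),
        "timestamp": datetime.fromisoformat(rec["read_at"]),
        "value": rec["kwh"],
    }

unified = [
    from_scada({"tag": "xfmr-12", "ts": 1466700000, "val": "73.2"}),
    from_billing({"meter_id": "xfmr-12", "read_at": "2016-06-23T17:20:00+00:00", "kwh": 41.7}),
]
print(unified)

Every extra source format adds another adapter like these, which is exactly the integration work the survey says lands on IT.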
Tags : 
     BitStew
By: MalwareBytes EMEA     Published Date: May 10, 2019
INDUSTRY: Education

BUSINESS CHALLENGE: Protect student data from threats posed by malware on teachers’ MacBook laptops

IT ENVIRONMENT: Avast antivirus, enterprise network security layers

SOLUTION: Malwarebytes Incident Response

RESULTS:
- Removed PUPs and malware from hundreds of Mac systems in just minutes
- Delivered instant visibility into connected systems and quarantined malware
- Reduced risk with the ability to proactively detect and remediate threats
Tags : 
     MalwareBytes EMEA
By: IBM APAC     Published Date: May 14, 2019
If anything is certain about the future, it’s that there will be more complexity, more data to manage, and greater pressure to deliver instantly. The hardware you buy should meet today’s expectations and prepare you for whatever comes next. Power Systems are built for the most demanding, data-intensive computing on earth. Our cloud-ready servers help you unleash insight from your data pipeline — from managing mission-critical data, to managing your operational data stores and data lakes, to delivering the best server for cognitive computing. With industry-leading reliability and security, our infrastructure is designed to crush the most data-intensive workloads imaginable while keeping your business protected.

- Simplified multicloud
- Built-in, end-to-end security
- Proven reliability
- Industry-leading value and performance
Tags : 
     IBM APAC
By: Group M_IBM Q2'19     Published Date: Apr 08, 2019
Empowering the Automotive Industry through Intelligent Orchestration

With the increasing complexity and volume of cyberattacks, organizations must be able to adapt quickly and confidently under changing conditions. Accelerating incident response times to safeguard the organization’s infrastructure and data is paramount. Achieving this requires a thoughtful plan: one that addresses the security ecosystem, incorporates security orchestration and automation, and provides adaptive workflows to empower security analysts. In the white paper “Six Steps for Building a Robust Incident Response Function,” IBM Resilient provides a framework for security teams to build a strong incident response program and deliver organization-wide coordination and optimization to accomplish these goals.
Tags : 
     Group M_IBM Q2'19
By: Blue Prism     Published Date: Mar 28, 2019
Robotic process automation describes the use of technology to automate tasks that are traditionally done by a human being. The technology itself mimics an end user by simulating user actions such as navigating within an application or entering data into forms according to a set of rules. RPA is often used to automate routine administrative tasks that typically require a human being to interact with multiple systems, but RPA technology is evolving to support the automation of increasingly sophisticated processes at scale within enterprise architectures rather than on the desktop. Over the past two years, RPA has been adopted by a number of business process outsourcing (BPO) providers and a growing number of end-user organizations are now deploying the technology themselves to create “virtual workforces” of robotic workers.
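A minimal sketch of the “apply rules, simulate user actions” pattern described above might look like the following. This is a generic illustration, not Blue Prism’s product API; the source record, rules, and form fields are all invented:

# Rules map each form field to a function deriving its value from a source record.
source_system = {"INV-1001": {"amount": "420.00", "vendor": "Acme"}}

rules = [
    ("invoice_number", lambda inv_id, rec: inv_id),
    ("vendor_name",    lambda inv_id, rec: rec["vendor"]),
    ("amount_due",     lambda inv_id, rec: rec["amount"]),
]

def enter_into_form(field, value):
    # Stand-in for the simulated keystrokes and UI navigation of a real robot.
    print(f"typing {value!r} into field {field!r}")

for inv_id, rec in source_system.items():
    for field, derive in rules:
        enter_into_form(field, derive(inv_id, rec))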
Tags : 
     Blue Prism
By: NetApp     Published Date: Dec 18, 2013
IT managers have indicated that their two most significant challenges in managing unstructured data at multiple locations are keeping pace with data growth and improving data protection. Learn how the NetApp Distributed Content Repository provides advanced data protection and system recovery capabilities that enable multiple data centers and remote offices to maintain access to data through hardware and software faults. Key benefits are:

- Continuous access to file data while maintaining data redundancy, with no administrator intervention needed
- Easy integration and deployment into a distributed environment, providing transparent, centrally managed content storage
- Secure multi-tenancy using security partitions
- Effectively infinite, on-demand capacity with fast access to files and objects in the cloud
- Secure, robust data protection techniques that enable data to persist beyond the life of the storage it resides on
Tags : 
     NetApp
By: ASG Software Solutions     Published Date: Feb 24, 2010
A recent survey of CIOs found that over 75% want to develop an overall information strategy in the next three years, yet over 85% are not close to implementing an enterprise-wide content management strategy. Meanwhile, data runs rampant, slows systems, and impacts performance. Hard-copy documents multiply, become damaged, or simply disappear.
Tags : asg, cmdb, bsm, itil, metacmdb, archiving, sap, ilm, mobius, workload automation, wla, visibility, configuration management, metadata, lob, sdm, service dependency mapping, ecommerce
     ASG Software Solutions
By: APC by Schneider Electric     Published Date: Feb 09, 2011
In the broadening data center cost-saving and energy-efficiency discussion, preventive maintenance (PM) of data center physical infrastructure is sometimes neglected as an important tool for controlling TCO and downtime. PM is performed specifically to prevent faults from occurring. IT and facilities managers can improve system uptime through a better understanding of PM best practices.
Tags : absolute, data protection, maintenance, ups, physical infrastructure, infrastructure, data loss protection, dlp, ponemon, it security, sensitive data
     APC by Schneider Electric
By: SAP     Published Date: Mar 09, 2017
Learn how CIOs can set up a system infrastructure for their business to get the best out of Big Data. Explore what the SAP HANA platform can do, how it integrates with Hadoop and related technologies, and the opportunities it offers to simplify your system landscape and significantly reduce cost of ownership.
Tags : 
     SAP
By: Spectrum Enterprise     Published Date: Oct 29, 2018
Bandwidth. Speed. Throughput. These terms are not interchangeable. They are interrelated concepts in data networking that measure, respectively, capacity, the time it takes data to get from one point to the next, and the actual amount of data you’re receiving. When you buy an Internet connection from Spectrum Enterprise, you’re buying a pipe between your office and the Internet with a set capacity, whether it is 25 Mbps, 10 Gbps, or any increment in between. However, the bandwidth we provide does not tell the whole story; it is the throughput of the entire system that matters. Throughput is affected by obstacles, overhead, and latency, meaning the throughput of the system will never equal the bandwidth of your Internet connection. The good news is that an Internet connection from Spectrum Enterprise is engineered to ensure you receive the capacity you purchase; we proactively monitor your bandwidth to ensure problems are dealt with promptly, and we are your advocates across the Internet.
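The distinction can be made concrete with the classic single-stream TCP bound: a connection carries at most one window of data per round trip, regardless of link capacity. The figures below are assumptions for illustration, not Spectrum Enterprise specifications:

# Why throughput never equals raw bandwidth: overhead and latency intervene.
link_bw_mbps = 100.0       # purchased capacity
rtt_s = 0.030              # 30 ms round-trip latency
tcp_window_bytes = 65535   # default TCP window without window scaling

# One window per round trip caps a single stream's rate.
window_limit_mbps = tcp_window_bytes * 8 / rtt_s / 1e6

# Ethernet/IP/TCP headers consume part of every 1500-byte packet.
overhead_factor = 1460 / 1538

effective_mbps = min(link_bw_mbps * overhead_factor, window_limit_mbps)
print(f"window-limited rate: {window_limit_mbps:.1f} Mb/s")
print(f"effective throughput: {effective_mbps:.1f} Mb/s")

With these assumed numbers, a 100 Mbps link delivers only about 17 Mb/s to a single unscaled TCP stream; latency, not bandwidth, is the binding constraint.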
Tags : 
     Spectrum Enterprise
By: Hewlett Packard Enterprise     Published Date: Jan 31, 2019
"IT needs to reach beyond the traditional data center and the public cloud to form and manage a hybrid connected system stretching from the edge to the cloud, wherever the cloud may be. We believe this is leading to a new period of disruption and development that will require organizations to rethink and modernize their infrastructure more comprehensively than they have in the past. Hybrid cloud and hybrid cloud management will be the key pillars of this next wave of digital transformation – which is on its way much sooner than many have so far predicted. They have an important role to play as part of a deliberate and proactive cloud strategy, and are essential if the full benefits of moving over to a cloud model are to be fully realized."
Tags : 
     Hewlett Packard Enterprise
By: Cisco EMEA     Published Date: Nov 13, 2017
The HX Data Platform uses a self-healing architecture that implements data replication for high availability, remediates hardware failures, and alerts your IT administrators so that problems can be resolved quickly and your business can continue to operate. Space-efficient, pointer-based snapshots facilitate backup operations, and native replication supports cross-site protection. Data-at-rest encryption protects data from security risks and threats. Integration with leading enterprise backup systems allows you to extend your preferred data protection tools to your hyperconverged environment.
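As a generic illustration of the replicate-then-heal pattern described above (not Cisco’s implementation; node names and the replica count are invented), consider this sketch:

# Every block is written to REPLICAS nodes; when a node fails, missing copies
# are rebuilt from survivors so availability returns without operator action.
REPLICAS = 3
nodes = {name: {} for name in ("node-a", "node-b", "node-c", "node-d")}

def write_block(block_id, data):
    # Place the block on the first REPLICAS nodes (a stand-in for real placement).
    for node in list(nodes)[:REPLICAS]:
        nodes[node][block_id] = data

def heal(failed):
    # Drop the failed node, then re-replicate any under-protected block.
    del nodes[failed]
    for block_id in {b for blocks in nodes.values() for b in blocks}:
        holders = [n for n in nodes if block_id in nodes[n]]
        if len(holders) < REPLICAS:
            target = next(n for n in nodes if block_id not in nodes[n])
            nodes[target][block_id] = nodes[holders[0]][block_id]

write_block("blk-1", b"payload")
heal("node-a")
print({n: sorted(blocks) for n, blocks in nodes.items()})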
Tags : hyperflex, systems, data platform, storage efficiency, business, cisco
     Cisco EMEA
By: BigCommerce     Published Date: Oct 16, 2018
Businesses that have lived through the evolution of the digital age are well aware that we’ve experienced a generational shift in technology. The rise of software as a service (SaaS), cloud, mobile, big data, the Internet of Things (IoT), social media, and other technologies has disrupted industries and changed customers’ expectations. In our always-on, buy-anything-anywhere world, customers want their shopping experiences to be personalized, dynamic, and convenient. As a result, many businesses are trying to reinvent themselves. Success in a fast-paced economy depends on continually adapting and innovating. Companies have to move quickly to keep up; there’s no time for disjointed technologies and old systems that don’t serve the customer-obsessed mentality needed to thrive in the digital age.
Tags : 
     BigCommerce