data system

By: Cray     Published Date: Jul 02, 2015
As global energy costs climb, Cray has taken its long-standing expertise in optimizing power and cooling and focused it on developing overall system energy efficiency. The resulting Cray XC supercomputer series integrates into modern datacenters and achieves high levels of efficiency while minimizing system and infrastructure costs.
Tags : 
     Cray
By: Green Revolution Cooling     Published Date: May 12, 2014
Download Green Revolution Cooling's white paper, "Data Center Floor Space Utilization – Comparing Density in Liquid Submersion and Air Cooling Systems," to learn about the density of liquid submersion cooling, why looks can be deceiving, and how, more often than not, liquid cooling beats air.
Tags : green revolution, data center floor space
     Green Revolution Cooling
By: Intel     Published Date: Aug 06, 2014
Designing a large-scale, high-performance data storage system presents significant challenges. This paper describes a step-by-step approach to designing such a system and presents an iterative methodology that applies at both the component level and the system level. A detailed case study applies this methodology to the design of a Lustre storage system.
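To make the component-to-system iteration concrete, here is a minimal sketch of that kind of sizing loop, assuming entirely hypothetical per-OSS throughput and per-OST capacity figures; the paper's actual methodology and numbers may differ.

```python
import math

def size_lustre_system(target_gbps, target_pb,
                       oss_gbps=5.0, ost_tb=40, osts_per_oss=8):
    """Iterate from component-level figures (one OSS node's throughput,
    one OST's capacity) to a system-level node count that satisfies
    both the bandwidth target and the capacity target."""
    by_bandwidth = math.ceil(target_gbps / oss_gbps)
    by_capacity = math.ceil((target_pb * 1000) / (ost_tb * osts_per_oss))
    return max(by_bandwidth, by_capacity)

# e.g. 100 GB/s aggregate bandwidth and 5 PB raw capacity
print(size_lustre_system(target_gbps=100, target_pb=5))  # -> 20 OSS nodes
```

Whichever requirement (bandwidth or capacity) dominates determines the node count; in practice one would re-check the other system-level targets and repeat.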
Tags : intel, high performance storage
     Intel
By: Intel     Published Date: Aug 06, 2014
Powering Big Data workloads with Intel® Enterprise Edition for Lustre* software. The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don't need to rewrite code or master new development tools.
• Storage - High-performance, highly scalable storage solutions with Intel® Enterprise Edition for Lustre* software and Intel® Xeon processor E7 based storage systems for centralized storage, plus reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Tags : 
     Intel
By: IBM     Published Date: Jun 05, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : ibm
     IBM
By: IBM     Published Date: Sep 02, 2014
This two-year research initiative, conducted in collaboration with IBM, focuses on key trends, best practices, challenges, and priorities in enterprise risk management, and covers topics as diverse as culture, organizational structure, data, systems, and processes.
Tags : ibm, chartis, risk-enabled enterprise
     IBM
By: IBM     Published Date: Sep 16, 2015
The IBM Spectrum Scale solution provided up to 11x better throughput than EMC Isilon for Spectrum Protect (TSM) workloads. Using published data, Edison compared a solution built on EMC® Isilon® against an IBM® Spectrum Scale™ solution. (IBM Spectrum Scale was formerly IBM® General Parallel File System™ or IBM® GPFS™, also known by the code name Elastic Storage.) For both solutions, IBM® Spectrum Protect™ (formerly IBM Tivoli® Storage Manager, or IBM® TSM®) was used as a common workload performing backups to the target storage systems evaluated.
Tags : 
     IBM
By: SGI     Published Date: Mar 03, 2015
The SGI UV system is uniquely suited to bioinformatics and genomics, providing the computational capabilities and global shared-memory architecture needed for even the most demanding sequencing and analytic tasks, including post-sequencing and other data-intensive workflows. Because of the system's outstanding speed and throughput, genomics researchers can complete very large jobs in less time, realizing a dramatically accelerated time-to-solution. Best of all, they can explore avenues of research that were computationally beyond the reach of HPC systems lacking the power and in-memory capabilities of the SGI UV.
Tags : 
     SGI
By: IBM     Published Date: Nov 14, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : 
     IBM
By: Panasas     Published Date: Oct 02, 2014
HPC and technical computing environments require the collection, storage, and transmission of large-scale datasets. To meet these demands, data center architects must consider how increasing storage capacity over time will affect HPC workloads, performance, and system availability. While many enterprises have looked to scale-up NAS to meet their storage needs, this approach can lead to data islands that make it difficult to share data. Distributed, scale-out storage was developed to get around the technology limitations of scale-up NAS architectures.
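A minimal sketch of the core idea behind scale-out designs, under illustrative assumptions and not reflecting Panasas's actual implementation: a deterministic hash maps each object to a storage node, so any client can locate any object in a single namespace, and capacity grows by adding nodes rather than by creating separate scale-up filers ("data islands").

```python
import hashlib

NODES = ["node-0", "node-1", "node-2", "node-3"]  # grow capacity by appending nodes

def place(object_name: str) -> str:
    """Any client can compute where an object lives -- one shared namespace."""
    digest = hashlib.md5(object_name.encode()).digest()
    return NODES[int.from_bytes(digest, "big") % len(NODES)]

print(place("genome/sample-42.bam"))  # same answer on every client
```

Real systems add replication, rebalancing, and metadata services on top, but the placement idea is what removes the single-filer bottleneck.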
Tags : 
     Panasas
By: HP     Published Date: Oct 08, 2015
Administrators, engineers, and executives are now tasked with solving some of the world's most complex challenges, revolving around advanced computations for science, business, education, pharmaceuticals, and beyond. Here's the challenge: many data centers are reaching peak levels of resource consumption, and there's more work to be done. So how are engineers and scientists supposed to keep working with such high-demand applications? How can they continue to produce ground-breaking research while still utilizing optimized infrastructure? How can a platform scale to the new needs and demands of these users and applications? This is where HP Apollo Systems help reinvent the modern data center and accelerate your business.
Tags : apollo systems, reinventing hpc and the supercomputer, reinventing modern data center
     HP
By: HPE     Published Date: Jul 21, 2016
Deep learning is a method of creating artificial intelligence systems that combine computer-based multi-layer neural networks with intensive training techniques and large data sets to enable analysis and predictive decision making. A fundamental aspect of deep learning environments is that they transcend finite programmable constraints to reach the realm of extensible and trainable systems. Recent developments in technology and algorithms have enabled deep learning systems not only to equal but to exceed human capabilities in the pace of processing vast amounts of information.
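As a toy illustration of "multi-layer neural networks with intensive training": the sketch below trains a two-layer network by gradient descent to fit XOR, a task no single linear layer can learn. It is purely illustrative and unrelated to any HPE product.

```python
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)
y = np.array([[0], [1], [1], [0]], dtype=float)   # XOR targets

W1, b1 = rng.normal(size=(2, 8)), np.zeros(8)     # layer 1 weights/bias
W2, b2 = rng.normal(size=(8, 1)), np.zeros(1)     # layer 2 weights/bias
sigmoid = lambda z: 1 / (1 + np.exp(-z))

for _ in range(5000):                      # the "intensive training" part
    h = sigmoid(X @ W1 + b1)               # hidden layer
    out = sigmoid(h @ W2 + b2)             # output layer
    g_out = (out - y) * out * (1 - out)    # backpropagate squared error
    g_h = g_out @ W2.T * h * (1 - h)
    W2 -= 0.5 * h.T @ g_out; b2 -= 0.5 * g_out.sum(axis=0)
    W1 -= 0.5 * X.T @ g_h;   b1 -= 0.5 * g_h.sum(axis=0)

print(out.round(2).ravel())                # approaches [0, 1, 1, 0]
```

Production deep learning stacks the same two ingredients, many more layers and far more training, which is why HPC-class compute matters.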
Tags : 
     HPE
By: BitStew     Published Date: Jun 23, 2016
If we examine the root cause of the IT organization's pain, it remains the inability to integrate data from many disparate, operationally focused systems. A recent survey found that only 18% of respondents had extensively integrated their analytics initiatives across operations and realized their objectives, a shortfall attributed to the diversity of datasets and formats and to poor data quality. Indeed, data integration places the analytics burden squarely on IT professionals. Download the white paper to learn more.
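A minimal sketch of the integration problem described above, with entirely hypothetical field names: two operational systems expose the same reading under different names, units, and timestamp formats, and each record must be normalized into one schema, and pass a basic quality check, before analytics can run.

```python
from datetime import datetime, timezone

def from_scada(rec):      # e.g. {"ts": 1718004800, "kw": "41.5"}
    return {"time": datetime.fromtimestamp(rec["ts"], timezone.utc),
            "power_kw": float(rec["kw"])}

def from_billing(rec):    # e.g. {"date": "2024-06-10T07:33:20Z", "watts": 41500}
    return {"time": datetime.fromisoformat(rec["date"].replace("Z", "+00:00")),
            "power_kw": rec["watts"] / 1000.0}

def clean(records):
    """Drop records failing a basic data-quality check."""
    return [r for r in records if r["power_kw"] >= 0]

unified = clean([from_scada({"ts": 1718004800, "kw": "41.5"}),
                 from_billing({"date": "2024-06-10T07:33:20Z", "watts": 41500})])
print(unified)            # one schema, ready for analytics
```

Multiply this by dozens of systems and formats and the survey's finding, that integration work lands on IT, is easy to understand.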
Tags : 
     BitStew
By: Oracle     Published Date: Sep 25, 2019
Research shows that legacy ERP 1.0 systems were not designed for usability and insight. More than three quarters of business leaders say their current ERP system doesn't meet their requirements, let alone their future plans.¹ These systems lack the modern best-practice capabilities needed to compete and grow. To enable today's data-driven organization, the very foundation from which you operate needs to be re-established: it needs to be "modernized". Oracle's goal is to help you navigate your own journey to modernization by sharing the knowledge we've gained working with many thousands of customers using both legacy and modern ERP systems. To that end, we've crafted this handbook outlining the fundamental characteristics that define modern ERP.
Tags : 
     Oracle
By: F5 Networks Singapore Pte Ltd     Published Date: Sep 19, 2019
"Security analysts have a tougher job than ever. New vulnerabilities and security attacks used to be a monthly occurrence, but now they make the headlines almost every day. It’s become much more difficult to effectively monitor and protect all the data passing through your systems. Automated attacks from bad bots that mimic human behavior have raised the stakes, allowing criminals to have machines do the work for them. Not only that, these bots leave an overwhelming number of alert bells, false positives, and inherent stress in their wake for security practitioners to sift through. Today, you really need a significant edge when combating automated threats launched from all parts of the world. Where to start? With spending less time investigating all that noise in your logs."
Tags : 
     F5 Networks Singapore Pte Ltd
By: Gigamon     Published Date: Sep 03, 2019
Network performance and security are vital elements of any business. Organisations are increasingly adopting virtualisation and cloud technologies to boost productivity, cost savings and market reach. With the added complexity of distributed network architectures, full visibility is necessary to ensure continued high performance and security. Greater volumes of data, rapidly evolving threats and stricter regulations have forced organisations to deploy new categories of security tools, e.g. Web Application Firewalls (WAFs) or Intrusion Prevention Systems (IPS). Yet simply adding more security tools may not always be the most efficient solution.
Tags : 
     Gigamon
By: BehavioSec     Published Date: Oct 04, 2019
In this case study, a large enterprise with a growing amount of off-site work, driven by business travel and a fast-expanding remote workforce, faces a unique challenge: ensuring its data security is scalable and impenetrable. Its data access policies rely on physical access management provided at the company offices and do not always give off-site employees the ability to complete work-critical tasks. Legacy security solutions only burden productivity, sometimes causing employees to ignore security protocols simply to complete their work. Upon evaluating security vendors for a frictionless solution, the company selected BehavioSec for its enterprise-grade capabilities, on-premise deployment, and integration with existing legacy risk management systems.
Tags : 
     BehavioSec
By: TIBCO Software     Published Date: Jul 22, 2019
AA Ireland specializes in home, motor, and travel insurance and provides emergency rescue for people in their homes and on the road, attending to over 140,000 car breakdowns every year, 80% of which are fixed on the spot. "In each of the last five years, the industry lost a quarter billion in motor insurance," says Colm Carey, chief analytics officer. "So, there's a huge push for new data, models, and ways to segment and pick profitable customer types, and to get a lot more sophisticated. Our goal is to optimize pricing, understand the types of customers we're bringing in, and the types we're trying to attract. We would like to tie that across the business. Marketing will run a campaign, trying to attract a lot of customers, but maybe they're not the right type. We wanted to step away from industry-standard software and go with something that was powerful and future-proof. In 2016, we had an opportunity to analyze all software. We chose the TIBCO® System of Insight with TIBCO BusinessWorks™ i
Tags : 
     TIBCO Software
By: TIBCO Software     Published Date: Jul 22, 2019
Today, you can improve product quality and gain better control of the entire manufacturing chain with data virtualization, machine learning, and advanced data analytics. With all relevant data aggregated, analyzed, and acted on, sensors, devices, people, and processes become part of a connected Smart Factory ecosystem providing:
• Increased uptime, reduced downtime
• Minimized surplus and defects
• Better yields
• Reduced cost due to better quality
• Fewer deviations and less non-conformance
A brief illustrative sketch of this aggregate-analyze-act loop follows.
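The sketch below is a hedged, minimal illustration, not TIBCO's implementation: aggregate sensor readings, analyze them against a rolling baseline, and act by flagging deviations. Window size, threshold, and data are made-up assumptions.

```python
from statistics import mean, stdev

def detect_anomalies(readings, window=10, n_sigmas=3.0):
    """Flag readings that deviate sharply from the recent rolling baseline."""
    alerts = []
    for i in range(window, len(readings)):
        base = readings[i - window:i]          # aggregate: recent history
        mu, sigma = mean(base), stdev(base)    # analyze: rolling baseline
        if sigma and abs(readings[i] - mu) > n_sigmas * sigma:
            alerts.append((i, readings[i]))    # act: raise an alert
    return alerts

temps = [70.1, 70.3, 69.9, 70.0, 70.2, 70.1, 69.8, 70.0, 70.2, 70.1, 84.7]
print(detect_anomalies(temps))                 # [(10, 84.7)]
```

A real Smart Factory pipeline runs this loop continuously across thousands of sensors and feeds the alerts into downstream actions such as maintenance tickets or line stops.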
Tags : 
     TIBCO Software
By: TIBCO Software     Published Date: Jul 22, 2019
Over the past decade, there has been a major transformation in the manufacturing industry. Data has enabled a paradigm shift, with real-time IoT sensor data and machine learning algorithms delivering new insights for process and product optimization. Smart Manufacturing, also known as Industry 4.0, has laid the groundwork for the next industrial revolution. Using a smart factory system, all relevant data is aggregated, analyzed, and acted upon. We call this Manufacturing Intelligence, which gives decision-makers a competitive edge to:
• Digitize the business
• Optimize costs
• Accelerate innovation
• Survive digital disruption
Watch this webinar to understand the use cases, and the underlying technology, that helped our customers become smart manufacturers.
Tags : 
     TIBCO Software
By: TIBCO Software     Published Date: Aug 02, 2019
As an insurer, the challenges you face today are unprecedented. Siloed and heterogeneous existing systems make understanding what’s going on inside and outside your business difficult and costly. Your systems weren’t set up to take advantage of, or even handle, the volume, velocity, and variety of new data streaming in from the internet of things, sensors, wearables, telematics, weather, social media, and more. And they weren’t designed for heavy human interaction. Millennials demand immediate information and services across digital channels. Can your systems keep up?
Tags : 
     TIBCO Software
By: Intel     Published Date: Sep 27, 2019
The stakes are high in today's data centers. Organisations have access to massive quantities of data promising valuable insights and new opportunities for business. But data center architects need to rethink and redesign their system architectures to ingest, store and process all that information. Similarly, application owners need to assess how they can process data more effectively. Those who don't re-architect might find themselves scrambling just to keep from being drowned in a data deluge.
Tags : 
     Intel
By: Intel     Published Date: Sep 30, 2019
Mountains of data promise valuable insights and innovation for businesses that rethink and redesign their system architectures. But companies that don’t re-architect might find themselves scrambling just to keep from being buried in the avalanche of data. The problem is not just in storing raw data, though. For businesses to stay competitive, they need to quickly and cost-effectively access and process all that data for business insights, research, artificial intelligence (AI), and other uses. Both memory and storage are required to enable this level of processing, and companies struggle to balance high costs against limited capacities and performance constraints. The challenge is even more daunting because different types of memory and storage are required for different workloads. Furthermore, multiple technologies might be used together to achieve the optimal tradeoff in cost versus performance. Intel is addressing these challenges with new memory and storage technologies that emp
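To make the cost-versus-performance tradeoff concrete, here is an illustrative sketch comparing two placements of a dataset across memory and storage tiers. All prices and latencies are made-up assumptions for illustration, not Intel figures.

```python
TIERS = {            # tier: ($ per GB, rough access latency in ns) -- illustrative
    "dram": (8.00, 100),
    "pmem": (3.00, 350),
    "ssd":  (0.10, 80_000),
}

def plan(dataset_gb, shares):
    """shares: fraction of the dataset placed on each tier."""
    cost = sum(dataset_gb * f * TIERS[t][0] for t, f in shares.items())
    avg_ns = sum(f * TIERS[t][1] for t, f in shares.items())
    return round(cost), round(avg_ns)

hot_tiered = plan(100_000, {"dram": 0.05, "pmem": 0.25, "ssd": 0.70})
all_flash  = plan(100_000, {"ssd": 1.0})
print(hot_tiered, all_flash)   # tiered plan costs more but cuts average latency
```

The point of mixing technologies is exactly this knob: shift hot data up the hierarchy until the marginal cost no longer buys meaningful performance for the workload.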
Tags : 
     Intel
By: Infinidat EMEA     Published Date: May 14, 2019
Big Data and analytics workloads represent a new frontier for organizations. Data is being collected from sources that did not exist 10 years ago. Mobile phone data, machine-generated data, and website interaction data are all being collected and analyzed. In addition, as IT budgets are already under pressure, Big Data footprints are getting larger and posing a huge storage challenge. This paper provides information on the issues that Big Data applications pose for storage systems and how choosing the correct storage infrastructure can streamline and consolidate Big Data and analytics applications without breaking the bank.
Tags : 
     Infinidat EMEA
By: Infinidat EMEA     Published Date: May 14, 2019
Continuous data availability is a key business continuity requirement for storage systems. It ensures protection against downtime in case of serious incidents or disasters and enables recovery to an operational state within a reasonably short period. To ensure continuous availability, storage solutions need to meet resiliency, recovery, and contingency requirements outlined by the organization.
Tags : 
     Infinidat EMEA