network

By: Cray     Published Date: Jul 02, 2015
The Cray XC series is a distributed memory system developed as part of Cray’s participation in the Defense Advanced Research Projects Agency’s (DARPA) High Productivity Computing System (HPCS) program. Previously codenamed “Cascade,” the Cray XC system is capable of sustained multi-petaflops performance and features a hybrid architecture combining multiple processor technologies, a high performance network and a high performance operating system and programming environment.
Tags : 
     Cray
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software
The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage - High performance, highly scalable storage solutions with Intel® Enterprise Edition for Lustre* software and Intel® Xeon Processor E7 based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Tags : 
     Intel
By: InsideHPC Special Report     Published Date: Aug 17, 2016
A single issue has always defined the history of HPC systems: performance. While offloading and co-design may seem like new approaches to computing, they actually have been used, to a lesser degree, in the past as a way to enhance performance. Current co-design methods are now going deeper into cluster components than was previously possible. These new capabilities extend from the local cluster nodes into the “computing network.”
Tags : 
     InsideHPC Special Report
By: IBM     Published Date: Sep 02, 2014
This brief webcast will cover the new and enhanced capabilities of Elastic Storage 4.1, including native encryption and secure erase, flash-accelerated performance, network performance monitoring, global data sharing, NFS data migration and more. IBM GPFS (Elastic storage) may be the key to improving your organization's effectiveness and can help define a clear data management strategy for future data growth and support.
Tags : ibm, elastic storage
     IBM
By: IBM     Published Date: Sep 16, 2015
6 criteria for evaluating a high-performance cloud services provider
Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or that need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to quickly adapt to changing business requirements, and dynamically managing workloads and data.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Learn how organizations in cancer research, speech recognition, financial services, automotive design and more are using IBM solutions to improve business results. IBM Software Defined Infrastructure enables organizations to deliver IT services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. It is the foundation for a fully integrated software defined environment, optimizing compute, storage and networking infrastructure so organizations can quickly adapt to changing business requirements.
Tags : 
     IBM
By: Seagate     Published Date: Sep 30, 2015
Although high-performance computing (HPC) often stands apart from a typical IT infrastructure—it uses highly specialized scale-out compute, networking and storage resources—it shares with mainstream IT the ability to push data center capacity to the breaking point. Much of this is due to data center inefficiencies caused by HPC storage growth. The Seagate® ClusterStor™ approach to scale-out HPC storage can significantly improve data center efficiency. No other vendor solution offers the same advantages.
Tags : 
     Seagate
By: IBM     Published Date: Feb 13, 2015
Value is migrating throughout the IT industry from hardware to software and services. High Performance Computing (HPC) is no exception. IT solution providers must position themselves to maximize the business value they deliver to their clients, particularly industrial customers who often use several applications that must be integrated in a business workflow. This requires systems and hardware vendors to invest in making their infrastructure “application ready.” With its Application Ready solutions, IBM is outflanking competitors in Technical Computing and fast-tracking the delivery of client business value by providing an expertly designed, tightly integrated and performance-optimized architecture for several key industrial applications. These Application Ready solutions come with a complete high-performance cluster, including servers, network, storage, operating system, management software, parallel file systems and other runtime libraries, all with commercial-level solution support.
Tags : 
     IBM
By: General Atomics     Published Date: Jan 13, 2015
The term “Big Data” has become virtually synonymous with “schema on read” (where a schema is applied to the data as it is ingested or pulled out of a stored location) and with unstructured data analysis and handling techniques like Hadoop. These “schema on read” techniques have been most famously exploited on relatively ephemeral human-readable data like retail trends, Twitter sentiment, social network mining, log files, etc. But what if you have unstructured data that, on its own, is hugely valuable, enduring, and created at great expense? Data that may not immediately be human readable or indexable by search? Exactly the kind of data most commonly created and analyzed in science and HPC. Research institutions are awash with such data from large-scale experiments and extreme-scale computing that is used for high-consequence research.
Tags : general atomics, big data, metadata, nirvana
     General Atomics
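The “schema on read” idea described in the General Atomics abstract can be sketched in a few lines of Python. This is an illustrative example only; the record format and field names are invented, not taken from the paper:

```python
import json

# Raw, unstructured records stored as-is; no schema was enforced at write time.
raw_records = [
    '{"user": "alice", "action": "view", "item": 42}',
    '{"user": "bob", "action": "buy"}',  # missing "item" field
    'not even valid JSON',               # malformed record
]

def read_with_schema(lines):
    """Apply a schema only at read time: parse, validate, and fill defaults."""
    for line in lines:
        try:
            rec = json.loads(line)
        except json.JSONDecodeError:
            continue  # schema-on-read tolerates bad rows by skipping them
        yield {
            "user": rec.get("user", "unknown"),
            "action": rec.get("action", "unknown"),
            "item": rec.get("item"),  # None when absent
        }

rows = list(read_with_schema(raw_records))
print(rows)  # the malformed record is dropped; missing fields become defaults
```

The point of the pattern is that interpretation is deferred: the stored bytes stay raw, and each reader can impose whatever structure its analysis needs.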
By: HPE     Published Date: Jul 21, 2016
Deep learning is a method of creating artificial intelligence systems that combine computer-based multi-layer neural networks with intensive training techniques and large data sets to enable analysis and predictive decision making. A fundamental aspect of deep learning environments is that they transcend finite programmable constraints to the realm of extensible and trainable systems. Recent developments in technology and algorithms have enabled deep learning systems to not only equal but exceed human capabilities in the pace of processing vast amounts of information.
Tags : 
     HPE
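As a rough illustration of the multi-layer neural networks the HPE abstract mentions, the following NumPy sketch runs one forward pass through a tiny two-layer network. The layer sizes and random weights here are arbitrary; a real deep learning system would train these weights against large data sets:

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Nonlinearity between layers; without it, stacked layers
    # would collapse into a single linear transform.
    return np.maximum(0.0, x)

# A tiny two-layer network: 4 inputs -> 8 hidden units -> 2 outputs.
W1, b1 = rng.normal(size=(4, 8)), np.zeros(8)
W2, b2 = rng.normal(size=(8, 2)), np.zeros(2)

def forward(x):
    """One forward pass; training would adjust W1, b1, W2, b2 from data."""
    h = relu(x @ W1 + b1)  # hidden layer
    return h @ W2 + b2     # output layer (e.g. class scores)

scores = forward(rng.normal(size=(3, 4)))  # a batch of 3 examples
print(scores.shape)  # (3, 2): one 2-way score vector per example
```

The matrix multiplications in `forward` are exactly the operations that the multi-core CPUs and GPUs discussed elsewhere in this library are used to accelerate at scale.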
By: Data Direct Networks     Published Date: Dec 31, 2015
Using high-performance parallel storage solutions, geologists and researchers can now incorporate larger data sets and execute more seismic and reservoir simulations faster than ever before, enabling higher-fidelity geological analysis and significantly reduced exploration risk. Given the high costs of exploration, oil and gas companies are increasingly turning to high-performance DDN storage solutions to eliminate I/O bottlenecks and minimize risk and costs, while delivering a larger number of higher-fidelity simulations in the same time as traditional storage architectures.
Tags : 
     Data Direct Networks
By: Data Direct Networks     Published Date: Dec 31, 2015
Parallelism and direct memory access enable faster and more accurate SAS analytics. Using Remote Direct Memory Access (RDMA)-based analytics and fast, scalable external disk systems with massively parallel access to data, SAS analytics-driven organizations can deliver timely and accurate execution for data-intensive workflows such as risk management, while incorporating larger datasets than with traditional NAS.
Tags : 
     Data Direct Networks
By: Data Direct Networks     Published Date: Dec 31, 2015
When it comes to generating increasingly larger data sets and stretching the limits of high performance computing (HPC), the field of genomics and next generation sequencing (NGS) is in the forefront. The major impetus for this data explosion began in 1990 when the U.S. kicked off the Human Genome Project, an ambitious project designed to sequence the three billion base pairs that constitute the complete set of DNA in the human body. Eleven years and $3 billion later the deed was done. This breakthrough was followed by a massive upsurge in genomics research and development that included rapid advances in sequencing using the power of HPC. Today an individual’s genome can be sequenced overnight for less than $1,000.
Tags : 
     Data Direct Networks
By: Limelight     Published Date: Feb 16, 2018
Websites are indispensable to many companies’ business, but as the threat of cyber attacks increases, a website can also be a serious risk factor. Companies therefore need to develop both the convenience and the security of their websites at the same time. This whitepaper outlines the optimal solution for smartly achieving these two aims simultaneously.
Tags : content delivery network, cybersecurity, ddos, waf, web application firewall, cdn, distributed denial of service, cloud security
     Limelight
By: Limelight     Published Date: Feb 16, 2018
DDoS attacks have long been known as a principal form of cyber attack risk. “The Financial Inspection Manual,” revised by the Japanese government’s Financial Services Agency in April 2015, identifies the risk of DDoS attacks and strongly emphasizes the need to take countermeasures. Other government agencies also acknowledge the frequency and severity of DDoS attacks. However, no clear method to completely prevent DDoS attacks has yet been established. Why is that? And what are the best measures companies can take at the present time?
Tags : content delivery network, cybersecurity, ddos, waf, web application firewall, cdn, distributed denial of service, cloud security
     Limelight
By: Limelight     Published Date: Feb 16, 2018
People today expect a compelling, interactive, and engaging digital experience. Few companies can exist without a website. In many cases, the Internet is the main channel through which customers gather information, and the performance of a company’s website directly affects its business. So, what measures can companies take to prevent site delays and improve performance? This white paper explains how a CDN works and the points to consider when selecting a CDN service.
Tags : content delivery network, cdn, digital content delivery, mobile delivery, global content delivery, live streaming, video on demand, video delivery
     Limelight
By: Limelight     Published Date: Feb 16, 2018
When it comes to delivering digital content, downtime isn’t the only concern. Today a poor user experience can be just as damaging as an outage. According to Limelight research, 78% of people will stop watching an online video after it buffers three times, and the majority of people will not wait more than 5 seconds for a website to load. Organizations looking to deliver great digital experiences for their customers often choose to deliver that content using Content Delivery Networks (CDNs). Using multiple CDNs to deliver these digital content experiences promises even greater levels of availability and performance. But it brings with it a host of questions. In this paper we’ll explore the 5 things you should know about multi-CDN in order to determine if it might make sense for your business.
Tags : content delivery network, cdn, multi-cdn, multiple cdns, website performance, website acceleration, digital content delivery, mobile delivery
     Limelight
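One common multi-CDN tactic behind the availability and performance claims above is routing each request to whichever CDN is currently performing best. Here is a toy Python sketch of latency-based selection; the CDN hostnames and latency figures are hypothetical, not drawn from the paper:

```python
# Recently measured round-trip latencies per CDN, in milliseconds.
# In practice these would come from real user monitoring or synthetic probes.
recent_latency_ms = {
    "cdn-a.example.net": 38.0,
    "cdn-b.example.net": 51.5,
    "cdn-c.example.net": 44.2,
}

def pick_cdn(latencies):
    """Return the CDN hostname with the lowest measured latency."""
    return min(latencies, key=latencies.get)

chosen = pick_cdn(recent_latency_ms)
print(chosen)  # cdn-a.example.net
```

Real multi-CDN traffic steering layers more on top of this (weighted splits, per-region decisions, automatic failover when a CDN degrades), but the core decision is the same comparison of measured performance.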
By: Limelight Networks     Published Date: Feb 21, 2018
Organizations looking to deliver great digital experiences for their customers often choose to deliver that content using Content Delivery Networks (CDNs). In some cases, using multiple CDNs to deliver these digital content experiences promises even greater levels of availability and performance. But how do you know if a multi-CDN strategy is right for your business? This free Guide will help!
Tags : 
     Limelight Networks
By: Limelight Networks     Published Date: Feb 21, 2018
Content delivery networks (CDNs) can significantly improve the user experience of your online audiences, but not all CDNs deliver the same level of service. Learn how to choose the right CDN for your business.
Tags : 
     Limelight Networks
By: Limelight Networks     Published Date: Feb 21, 2018
From the boardroom to the backroom, everyone is looking for ways to protect their digital content from cyber threats like DDoS attacks, unauthorized access or theft. In this white paper, we’ll discuss ten different ways to protect your digital content, ensure high availability and maintain superior quality of experience for every digital visitor.
Tags : 
     Limelight Networks
By: Limelight Networks     Published Date: Feb 22, 2018
The State of Online Video is Limelight Networks’ latest in a series of surveys that explores consumer perceptions and behaviors around digital content.
Tags : 
     Limelight Networks
By: Limelight Networks     Published Date: Feb 22, 2018
In the increasingly competitive OTT market, competition for viewers is high. Providers must find ways not just to deliver compelling content, but to deliver compelling viewing experiences. In this whitepaper, you’ll learn about the critical challenges facing OTT providers today and how they can be overcome to provide the broadcast-quality experiences viewers expect, regardless of the device in use or the viewer’s location in the world. Are you ready to keep your subscribers happy and away from your competition? Download this free white paper, OTT 3.0: How to Build a Better Mousetrap, and learn:
• Why personalized content discovery is so important to viewers – and to the success of your business
• How to avoid internet congestion by leveraging technologies like a CDN
• The importance of global network scale to meet spikes in consumer traffic
• The impact of advertising on viewer abandonment
Tags : 
     Limelight Networks
By: Pure Storage     Published Date: Jan 12, 2018
Data is growing at amazing rates and will continue this rapid rate of growth. New techniques in data processing and analytics, including AI, machine learning and deep learning, allow specially designed applications to not only analyze data but learn from the analysis and make predictions. Computer systems consisting of multi-core CPUs or GPUs using parallel processing and extremely fast networks are required to process the data. However, legacy storage solutions are based on decades-old architectures that are unscalable and ill-suited to the massive concurrency required by machine learning. Legacy storage is becoming a bottleneck in processing big data, and a new storage technology is needed to meet data analytics performance needs.
Tags : reporting, artificial intelligence, insights, organization, institution, recognition
     Pure Storage
By: Pure Storage     Published Date: Jan 12, 2018
Deep learning opens up new worlds of possibility in artificial intelligence, enabled by advances in computational capacity, the explosion in data, and the advent of deep neural networks. But data is evolving quickly and legacy storage systems are not keeping up. Advanced AI applications require a modern all-flash storage infrastructure that is built specifically to work with high-powered analytics.
Tags : data, deep learning, automated, intelligence, pure storage
     Pure Storage
By: Pure Storage     Published Date: Jan 12, 2018
Interest in machine learning has exploded over the past decade. You see machine learning in computer science programs, industry conferences, and the Wall Street Journal almost daily. For all the talk about machine learning, many conflate what it can do with what they wish it could do. Fundamentally, machine learning is using algorithms to extract information from raw data and represent it in some type of model. We use this model to infer things about other data we have not yet modeled. Neural networks are one type of model for machine learning; they have been around for decades.
Tags : learning machines, automated intelligence, scalars, vectors, mathematics, classification, pure storage
     Pure Storage
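The fit-then-infer loop the Pure Storage abstract describes (extract a model from raw data, then use it on data not yet seen) can be shown with ordinary least squares, one of the simplest model types. This is an illustrative sketch on synthetic data, not an example from the paper:

```python
import numpy as np

# Raw data: noisy samples from the line y = 2x + 1.
rng = np.random.default_rng(1)
x = np.linspace(0.0, 10.0, 50)
y = 2.0 * x + 1.0 + rng.normal(scale=0.1, size=x.size)

# "Learning": extract a model (slope, intercept) from the raw data.
A = np.vstack([x, np.ones_like(x)]).T
(slope, intercept), *_ = np.linalg.lstsq(A, y, rcond=None)

# "Inference": apply the model to data we have not modeled before.
x_new = 25.0
y_pred = slope * x_new + intercept
print(round(slope, 1), round(intercept, 1))  # close to 2.0 and 1.0
```

Neural networks replace the two-parameter line with millions of parameters and a nonlinear function, but the pattern is identical: fit a model to data, then use it to make predictions about new data.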
Get your white papers featured in the insideHPC White Paper Library contact: Kevin@insideHPC.com