
Results 1 - 25 of 2261 | Sort Results By: Published Date | Title | Company Name
By: Cray     Published Date: Jul 02, 2015
As global energy costs climb, Cray has taken its long-standing expertise in optimizing power and cooling and focused it on developing overall system energy efficiency. The resulting Cray XC supercomputer series integrates into modern datacenters and achieves high levels of efficiency while minimizing system and infrastructure costs.
Tags : 
     Cray
By: Altair     Published Date: Jul 15, 2014
Impact analysis or drop testing is one of the most important stages of product design and development, and software that can simulate this testing accurately yields dramatic cost and time-to-market benefits for manufacturers. Dell, Intel and Altair have collaborated to analyze a virtual drop test solution with integrated simulation and optimization analysis, delivering proven gains in speed and accuracy. With this solution, engineers can explore more design alternatives for improved product robustness and reliability. As a result, manufacturers can significantly reduce the time to develop high-performing designs, improving product quality while minimizing time to delivery.
Tags : 
     Altair
By: Bright Computing     Published Date: May 05, 2014
A successful HPC cluster is a powerful asset for an organization. The following essential strategies are guidelines for the effective operation of an HPC cluster resource:
1. Plan to manage the cost of software complexity
2. Plan for scalable growth
3. Plan to manage heterogeneous hardware/software solutions
4. Be ready for the cloud
5. Have an answer for the Hadoop question
Bright Cluster Manager addresses the above strategies remarkably well and allows HPC and Hadoop clusters to be easily created, monitored, and maintained using a single comprehensive user interface. Administrators can focus on more sophisticated, value-adding tasks rather than developing homegrown solutions that may cause problems as clusters grow and change. The end result is an efficient and successful HPC cluster that maximizes user productivity.
Tags : bright computing, hpc clusters
     Bright Computing
By: IBM     Published Date: Jun 05, 2014
This short webcast spotlights ways you can accelerate results and optimize your infrastructure, and shows how IBM Platform Computing software and IBM Technical Computing solutions are helping to simplify the complexity of deploying and managing a high-performance IT infrastructure.
Tags : ibm
     IBM
By: IBM     Published Date: Jun 05, 2014
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : ibm
     IBM
By: IBM     Published Date: Jun 05, 2014
In an audited benchmark conducted by STAC®, the Securities Technology Analysis Center, InfoSphere BigInsights for Hadoop was found to deliver an approximate 4x performance gain on average over open source Hadoop running jobs derived from production workload traces. The result is consistent with an approximate 11x advantage in raw scheduling performance provided by Adaptive MapReduce – a new InfoSphere BigInsights for Hadoop feature that leverages high-performance computing technology from IBM Platform Computing.
Tags : ibm
     IBM
By: IBM     Published Date: Jun 05, 2014
An executive summary of the results of a survey conducted by Desktop Engineering to gauge its audience’s familiarity with high performance cluster computing and its benefits.
Tags : ibm
     IBM
By: IBM     Published Date: Jun 05, 2014
IBM Platform Symphony is a high performance SOA grid server that optimizes application performance and resource sharing. Platform Symphony runs distributed application services on a scalable, shared, heterogeneous grid and accelerates a wide variety of parallel applications, quickly computing results while making optimal use of available infrastructure. Platform Symphony Developer Edition enables developers to rapidly develop and test applications without the need for a production grid. After applications are running in the Developer Edition, they are guaranteed to run at scale once published to a scaled-out Platform Symphony grid. Platform Symphony Developer Edition also enables developers to easily test and verify Hadoop MapReduce applications against IBM Platform Symphony. By leveraging IBM Platform Symphony's proven, low-latency grid computing solution, more MapReduce jobs can run faster, frequently with less infrastructure.
Tags : ibm
     IBM
By: IBM     Published Date: Sep 02, 2014
This two-year research initiative in collaboration with IBM focuses on key trends, best practices, challenges, and priorities in enterprise risk management and covers topics as diverse as culture, organizational structure, data, systems, and processes.
Tags : ibm, chartis, risk-enabled enterprise
     IBM
By: IBM     Published Date: Sep 16, 2015
The IBM Spectrum Scale solution delivered up to 11x better throughput than EMC Isilon for Spectrum Protect (TSM) workloads. Using published data, Edison compared a solution based on EMC® Isilon® against an IBM® Spectrum Scale™ solution. (IBM Spectrum Scale was formerly IBM® General Parallel File System™ or IBM® GPFS™, also known by the code name Elastic Storage.) For both solutions, IBM® Spectrum Protect™ (formerly IBM Tivoli® Storage Manager or IBM® TSM®) was used as a common workload performing the backups to the target storage systems evaluated.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
6 criteria for evaluating a high-performance cloud services provider
Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to quickly adapt to changing business requirements, and dynamically managing workloads and data.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Learn how organizations in cancer research, speech recognition, financial services, automotive design and more are using IBM solutions to improve business results. IBM Software Defined Infrastructure enables organizations to deliver IT services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. It is the foundation for a fully integrated software defined environment, optimizing compute, storage and networking infrastructure so organizations can quickly adapt to changing business requirements.
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
A high performance computing (HPC) cluster refers to a group of servers built from off-the-shelf components that are connected via certain interconnect technologies. A cluster can deliver aggregated computing power from its many processors with many cores — sometimes hundreds, even thousands — to meet the processing demands of more complex engineering software, and therefore deliver results faster than individual workstations. If your company is in the majority that could benefit from access to more computing power, a cluster composed of commodity servers may be a viable solution to consider, especially now that they’re easier to purchase, deploy, configure and maintain than ever before. Read more and learn about the '5 Easy Steps to a High Performance Cluster'.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
The new Clusters for Dummies e-book from IBM Platform Computing explains how clustering technology enables you to run higher quality simulations and shorten the time to discoveries. In this e-book, you’ll discover how to:
1. Make a cluster work for your business
2. Create clusters using commodity components
3. Use workload management software for reliable results
4. Use cluster management software for simplified administration
5. Learn from case studies of clusters in action
With clustering technology you can increase your compute capacity, accelerate the innovation process, shrink time to insights, and improve your productivity, all of which will lead to increased competitiveness for your business.
Tags : 
     IBM
By: Dell and Intel®     Published Date: Mar 30, 2015
Dell has teamed with Intel® to create innovative solutions that can accelerate the research, diagnosis and treatment of diseases through personalized medicine. The combination of leading-edge Intel® Xeon® processors and the systems and storage expertise from Dell create a state-of-the-art data center solution that is easy to install, manage and expand as required. Labeled the Dell Genomic Data Analysis Platform (GDAP), this solution is designed to achieve fast results with maximum efficiency. The solution is architected to address a number of customer challenges, including the perception that implementation must be large-scale in nature, as well as compliance, security and clinician use.
Tags : 
     Dell and Intel®
By: Cisco EMEA     Published Date: Mar 26, 2019
Most organizations have invested, and continue to invest, in people, processes, technology, and policies to meet customer privacy requirements and avoid significant fines and other penalties. In addition, data breaches continue to expose the personal information of millions of people, and organizations are concerned about the products they buy, the services they use, the people they employ, and with whom they partner and do business. As a result, customers are asking more questions during the buying cycle about how their data is captured, used, transferred, shared, stored, and destroyed. In last year’s study (Cisco 2018 Privacy Maturity Benchmark Study), Cisco introduced data and insights regarding how these privacy concerns were negatively impacting the buying cycle and timelines. This year’s research updates those findings and explores the benefits associated with privacy investment. Cisco’s Data Privacy Benchmark Study utilizes data from Cisco’s Annual Cybersecurity Benchmark Study.
Tags : 
     Cisco EMEA
By: TIBCO Software     Published Date: Mar 15, 2019
On-demand Webinar
The current trend in manufacturing is toward tailor-made products in smaller lots with shorter delivery times. This change may lead to frequent production modifications, resulting in increased machine downtime, higher production costs, product waste, and the need to rework faulty products. Watch this webinar to learn how TIBCO’s Smart Manufacturing solutions can help you overcome these challenges. You will also see a demonstration of TIBCO technology in action around improving yield and optimizing processes while also saving costs.
What You Will Learn:
1. Applying advanced analytics and machine learning/AI techniques to optimize complex manufacturing processes
2. How multivariate statistical process control can help to detect deviations from a baseline
3. How to monitor OEE in real time and produce a 360-degree view of your factory
The webinar also highlights customer case studies from clients who have already successfully implemented process optimization models.
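The baseline-deviation detection mentioned in this abstract is commonly done with a Hotelling's T² statistic, which scores each new multivariate measurement against the mean and covariance of in-control data. The sketch below is illustrative only (the function name and data are hypothetical, not from the TIBCO webinar) and assumes a NumPy-style workflow:

```python
import numpy as np

def hotelling_t2(baseline, samples):
    """Hotelling's T^2 statistic for each sample against an in-control baseline.

    baseline: (n, p) array of in-control process measurements
    samples:  (m, p) array of new measurements to monitor
    Returns an (m,) array; large values flag deviation from the baseline.
    """
    mean = baseline.mean(axis=0)
    cov_inv = np.linalg.inv(np.cov(baseline, rowvar=False))
    diff = samples - mean
    # T^2_i = diff_i . cov_inv . diff_i, computed for all rows at once
    return np.einsum('ij,jk,ik->i', diff, cov_inv, diff)

# Toy example: a 2-variable process with one clearly drifted sample
rng = np.random.default_rng(0)
baseline = rng.normal(0.0, 1.0, size=(200, 2))
samples = np.array([[0.1, -0.2],   # near the baseline mean
                    [4.0, 4.0]])   # far outside the in-control region
t2 = hotelling_t2(baseline, samples)
print(t2)  # the second score is far larger, signaling a deviation
```

In a monitoring loop, each T² score would be compared against a control limit (typically derived from an F-distribution) to decide when to raise an alarm.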
Tags : 
     TIBCO Software
By: Rubrik EMEA     Published Date: Apr 15, 2019
From stolen consumer data to sensitive data leaks, it seems that no one’s data has been safe in recent years. For numerous reasons, like misconfigured storage repositories and unpatched vulnerabilities, this trend is likely to continue. The integration of digital technology into all areas of business has resulted in more of our data being stored on computers and websites targeted by hackers, which has significantly increased the number of data breaches as well as organizations’ vulnerability to malware attacks. For example, the Equifax breach impacted 145 MM consumers, and with more employees working remotely on a wide range of devices, the threat landscape has expanded. The meteoric rise of the public cloud has compounded this issue, as data security requires new knowledge and skill sets in short supply, often leading to misconfigured and insecure solutions. Companies need to adopt the approach that every piece of data in their possession, on-premises or in the cloud, must be encrypted.
Tags : encryption, data, key, cloud, bits, keys, ciphertext, entropy, plaintext, software
     Rubrik EMEA
By: 3D Systems     Published Date: May 15, 2019
Investment casting is a precise manufacturing methodology that delivers value across industries, from mechanical, automotive and aerospace parts to intricate dental work, jewelry and sculpture. For centuries, the trade-off for smooth and accurate investment-cast parts has been high costs and long casting pattern lead times. Now the evolution of parts is accelerating dramatically in many industries, resulting in shorter product life cycles and lower volumes of cast parts between cycles. Waiting for tooling for obsolete parts on aging aircraft also means delays in getting aircraft repaired, costing time and money. Demand for faster foundry production is increasing in all industries, and foundries need to be ready to respond. To find out how 3D Systems can help your business, download this eBook today.
Tags : 
     3D Systems
By: Dell EMC     Published Date: May 08, 2019
IDC reports companies that modernize IT infrastructure for emerging technologies such as AI thrive and see results such as launching IT services faster and closing more deals. Access this comprehensive whitepaper from Dell and Intel® to learn more.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: May 09, 2019
Disaster recovery (DR) and long-term retention of data, for security as well as regulatory compliance, can be very challenging for mid-sized organizations. Keeping a secondary site up for DR can get very expensive and dealing with tape can be slow and costly. As a result, many mid-sized organizations are looking to the efficiencies of the cloud, such as scale, elasticity, agility, and lower initial storage costs, to expand their data protection environments. Check out this easy-to-absorb infographic to learn how to achieve powerful, converged, easy to deploy and manage, cloud-ready data protection.
Tags : 
     Dell EMC
By: Schneider Electric     Published Date: Jun 03, 2019
Since the sepia-toned days of the early 19th century, industry has sought effective ways to control manufacturing and production processes. New technology has greatly influenced factories and plants, resulting in new operational approaches to maximize benefits and achieve 100% ROI in a very short time. Download the white paper to learn more.
Tags : process control, empowered operators, optimized assets, future of automation, reliability, safety, cybersecurity, operational profitability, ecosttruxure plant, process automation
     Schneider Electric
By: Schneider Electric     Published Date: Jun 03, 2019
The safety of operations can have a direct, positive impact on the operational profitability of the plant. Environmental health and safety (EH&S) can now be viewed not just as a cost center, but as a profit center, and new levels of both safety and profitability can result. Real-time safe profitability is no longer a dream—it is a reality!
Tags : smart control, empowered operators, optimized assets, future of automation, reliability, safety, cybersecurity, operational profitability, ecostruxure plant, process automation, profitable safety, defence in depth, industrial automation, process control, process systems, safety instrumented systems
     Schneider Electric
Get your white papers featured in the insideHPC White Paper Library contact: Kevin@insideHPC.com