technology

Results 1 - 25 of 5490
By: Seagate     Published Date: Jan 26, 2016
Finding oil and gas has always been a tricky proposition, given that reserves are hidden underground and, as often as not, under the ocean as well. The costs involved in acquiring rights to a site, drilling the wells, and operating them are considerable and have driven the industry to adopt advanced technologies for locating the most promising sites. As a consequence, oil and gas exploration today is essentially an exercise in scientific visualization and modeling, employing some of the most advanced computational technologies available. High performance computing (HPC) systems are being used to fill these needs, primarily with x86-based cluster computers and Lustre storage systems. The technology is well developed, but the scale of the problem demands medium- to large-sized systems, requiring a significant capital outlay and operating expense. The most powerful systems deployed by oil and gas companies are petaflop-scale computers with multiple petabytes of attached storage.
Tags : 
     Seagate
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software
The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage - High-performance, highly scalable storage solutions with Intel® Enterprise Edition for Lustre* software and Intel® Xeon processor E7 based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Tags : 
     Intel
By: Altair     Published Date: Feb 12, 2014
A Cray-Altair Solution for Optimized External Aerodynamics
Cray and Altair are leaders in providing the powerful, usable technology engineers need to perform external aerodynamics analysis with greater speed and accuracy. With Altair’s HyperWorks Virtual Wind Tunnel running on Cray XC30 or CS300 systems, manufacturers of all sizes can now predict a vehicle’s external aerodynamic performance and improve the cooling, comfort, visibility and stability features in their designs -- without the need for numerous physical wind tunnel tests.
Tags : cray-altair, optimized external aerodynamics, altair hyperworks
     Altair
By: IBM     Published Date: Jun 05, 2014
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly challenged to do more with less, and this is fundamentally changing the way IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : ibm
     IBM
By: IBM     Published Date: Jun 05, 2014
In an audited benchmark conducted by STAC®, the Securities Technology Analysis Center, InfoSphere BigInsights for Hadoop was found to deliver an approximately 4x average performance gain over open-source Hadoop running jobs derived from production workload traces. The result is consistent with an approximately 11x advantage in raw scheduling performance provided by Adaptive MapReduce, a new InfoSphere BigInsights for Hadoop feature that leverages high-performance computing technology from IBM Platform Computing.
Tags : ibm
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
Docker is a lightweight Linux container technology built on top of LXC (LinuX Containers) and cgroups (control groups), and it offers many attractive benefits for HPC environments. Find out how IBM Platform LSF® and Docker have been integrated outside the core of Platform LSF, using a real-world example involving the application BWA (bio-bwa.sourceforge.net). This step-by-step white paper provides details on how to get started with the IBM Platform LSF and Docker integration, which is available via open beta on Service Management Connect. (A minimal containerized-BWA sketch follows this entry.)
Tags : 
     IBM
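The paper above covers submitting containerized BWA jobs through Platform LSF. As a standalone illustration of the containerized-BWA idea only, and not of the IBM integration itself, here is a minimal sketch using the Docker SDK for Python; the image name and host directory are hypothetical placeholders.

# Minimal sketch (not from the IBM paper): run a BWA indexing step inside a
# Docker container using the Docker SDK for Python (docker-py).
# The image name and host directory below are hypothetical placeholders.
import docker

client = docker.from_env()

HOST_DATA_DIR = "/scratch/bwa_demo"   # hypothetical host directory holding ref.fa
IMAGE = "example/bwa:latest"          # hypothetical image containing the bwa binary

# Mount the host data directory into the container and build the BWA index.
logs = client.containers.run(
    image=IMAGE,
    command="bwa index /data/ref.fa",
    volumes={HOST_DATA_DIR: {"bind": "/data", "mode": "rw"}},
    remove=True,                      # remove the container once the job exits
)
print(logs.decode())                  # container output captured by the SDK

In the integration the paper describes, a launch like this would be wrapped in an LSF job submission so that the scheduler, rather than the user, decides where and when the container runs.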
By: IBM     Published Date: May 20, 2015
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly challenged to do more with less, and this is fundamentally changing the way IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
A*STAR had high levels of user discontent and not enough computational resources for its population of users or its number of research projects. Platform LSF acted as the single unifying workload scheduler and helped rapidly increase resource utilization.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
Clusters for Dummies, a new e-book from IBM Platform Computing, explains how clustering technology enables you to run higher-quality simulations and shorten the time to discovery. In this e-book, you’ll discover how to:
• Make a cluster work for your business
• Create clusters using commodity components
• Use workload management software for reliable results
• Use cluster management software for simplified administration
• Learn from case studies of clusters in action
With clustering technology you can increase your compute capacity, accelerate the innovation process, shrink time to insight, and improve your productivity, all of which will lead to increased competitiveness for your business.
Tags : 
     IBM
By: Panasas     Published Date: Oct 02, 2014
HPC and technical computing environments require the collection, storage, and transmission of large-scale datasets. To meet these demands, data center architects must consider how increasing storage capacity over time will affect HPC workloads, performance, and system availability. While many enterprises have looked to scale-up NAS to meet their storage needs, this approach can lead to data islands that make it difficult to share data. Distributed, scale-out storage was developed to get around the technology limitations of scale-up NAS architectures.
Tags : 
     Panasas
By: Dell and Intel®     Published Date: Nov 18, 2015
The NCSA Private Sector Program creates a high-performance computing cluster to help corporations overcome critical challenges. Through its Private Sector Program (PSP), NCSA has provided supercomputing, consulting, research, prototyping and development, and production services to more than one-third of the Fortune 50, in manufacturing, oil and gas, finance, retail/wholesale, bio/medical, life sciences, technology and other sectors. “We’re not the typical university supercomputer center, and PSP isn’t a typical group,” Giles says. “Our focus is on helping companies leverage high-performance computing in ways that make them more competitive.”
Tags : 
     Dell and Intel®
By: AMD     Published Date: Nov 09, 2015
Graphics Processing Units (GPUs) have become a compelling technology for High Performance Computing (HPC), delivering exceptional performance per watt and impressive densities for data centers. AMD has partnered with Hewlett Packard Enterprise to offer compelling solutions that drive your HPC workloads to new levels of performance. Learn about the awe-inspiring performance and energy efficiency of the AMD FirePro™ S9150, found in multiple HPE servers including the popular 2U HPE ProLiant DL380 Gen9 server. See why open standards matter for HPC, and what AMD is doing in this area. Click here to read more on AMD FirePro™ server GPUs for HPE ProLiant servers.
Tags : 
     AMD
By: Avere Systems     Published Date: Jun 27, 2016
This white paper reviews common HPC-environment challenges and outlines solutions that can help IT professionals deliver best-in-class HPC cloud solutions without undue stress and organizational chaos. The paper:
• Identifies current issues that stress IT teams and existing infrastructure across industries and HPC applications, including data management, data center limitations, user expectations, and technology shifts
• Describes the potential cost savings, operational scale, and new functionality that cloud solutions can bring to big compute
• Characterizes technical and other barriers to an all-cloud infrastructure, and describes how IT teams can leverage a hybrid cloud for compute power, maximum flexibility, and protection against lock-in
Tags : 
     Avere Systems
By: HPE     Published Date: Jul 21, 2016
Deep learning is a method of creating artificial intelligence systems that combine computer-based multi-layer neural networks with intensive training techniques and large data sets to enable analysis and predictive decision making. A fundamental aspect of deep learning environments is that they move beyond finite, hand-programmed constraints to systems that are extensible and trainable. Recent developments in technology and algorithms have enabled deep learning systems not only to equal but to exceed human capabilities in the pace at which they process vast amounts of information. (A minimal training sketch follows this entry.)
Tags : 
     HPE
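As a concrete illustration of the "multi-layer neural networks plus intensive training" idea in the entry above, here is a minimal sketch that assumes nothing from the HPE paper itself: a tiny two-layer network trained by gradient descent on the XOR problem in plain NumPy. Production deep learning differs mainly in scale (more layers, far more data, accelerator hardware), not in kind.

# Minimal sketch (toy example, not from the HPE paper): a two-layer neural
# network trained by gradient descent on XOR, showing the forward pass,
# backward pass, and weight-update loop that deep learning systems run at scale.
import numpy as np

rng = np.random.default_rng(0)
X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)  # inputs
y = np.array([[0], [1], [1], [0]], dtype=float)              # XOR targets

W1 = rng.normal(0, 1, (2, 8)); b1 = np.zeros(8)   # input -> hidden layer
W2 = rng.normal(0, 1, (8, 1)); b2 = np.zeros(1)   # hidden -> output layer

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

lr = 0.5
for step in range(5000):
    # Forward pass through both layers.
    h = np.tanh(X @ W1 + b1)
    p = sigmoid(h @ W2 + b2)

    # Backward pass: gradients of the mean squared error.
    grad_out = (p - y) * p * (1 - p)          # d(loss)/d(output pre-activation)
    grad_W2 = h.T @ grad_out / len(X)
    grad_b2 = grad_out.mean(axis=0)
    grad_h = (grad_out @ W2.T) * (1 - h**2)   # backpropagate through tanh
    grad_W1 = X.T @ grad_h / len(X)
    grad_b1 = grad_h.mean(axis=0)

    # Gradient-descent update.
    W2 -= lr * grad_W2; b2 -= lr * grad_b2
    W1 -= lr * grad_W1; b1 -= lr * grad_b1

print(np.round(p.ravel(), 2))   # should approach [0, 1, 1, 0]

The same forward/backward/update loop, run over millions of parameters and massive data sets, is what drives the GPU and HPC hardware requirements discussed throughout this library.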
By: Red Hat     Published Date: Sep 09, 2018
As applications and services become more central to business strategy, and as distributed methodologies like agile and DevOps change the way teams operate, it is critical for IT leaders to integrate their backend systems, legacy systems, and teams in an agile, adaptable way. This e-book details an architecture called agile integration, consisting of three technology pillars—distributed integration, containers, and APIs—to deliver flexibility, scalability, and reusability.
Tags : 
     Red Hat
By: SAP     Published Date: Aug 14, 2018
As legacy ERP data structures and technology begin winding down, they are making way for a whole new generation of technology that brings exponential performance and dramatically new problem-solving capabilities. Download the whitepaper to discover the demands driving this evolution, explore the potential promised by next-gen ERP, and learn why ‘playing it safe’ is no longer an option.
Tags : 
     SAP
By: SAP     Published Date: Jul 23, 2018
Leading companies are making digital transformation a reality by putting data and intelligence at the center of their future. They are building new capabilities, skills, and technology, and evolving their culture to transform into an ‘Intelligent Enterprise’ and achieve these outcomes. These companies are not only delivering short-term value to shareholders, but are also positioned to thrive and transform their industry. Explore how SAP can help you navigate the journey to the Intelligent Enterprise.
Tags : 
     SAP
By: Zendesk     Published Date: Jun 29, 2018
In the global market for customer service software, Zendesk is once again recognized as a leader in the 2018 Gartner Magic Quadrant for the CRM Customer Engagement Center. Every year, Gartner conducts a thorough analysis of service providers in the customer service and support application space. We believe the Gartner Magic Quadrant for the CRM Customer Engagement Center report provides valuable information for business leaders who seek technology solutions for interacting and engaging with their customers. Zendesk can again be found in the 2018 report’s Leader quadrant, which we consider a reflection of the success of our 125,000 customers, including enterprise clients like Airbnb, Tesco, and the University of Tennessee. The past year alone has included a number of significant milestones for us, including the release of AI-enhanced features for self-service and surpassing a $500 million revenue run rate. As our customer base continues to grow, we strive to be a dynamic vendor for businesses.
Tags : 
     Zendesk
By: Red Hat     Published Date: Aug 22, 2018
In the emerging digital enterprise, there’s a good chance some application development will be taking place outside the information technology department. It’s not that the role of IT is in any way being diminished – in fact, IT managers are getting busier than ever, overseeing the technology strategies of their enterprises. Rather, the pieces are in place for business users to build and configure the essential business applications they need, on a self-service basis, with minimal or no involvement of their IT departments. As the world moves deeper into an era of ongoing disruption from digital players – be they startups or teams within established enterprises – technology has become an essential part of every job, from the boardroom to the boiler room. Accordingly, the discipline of IT is no longer confined to the data center or development shop. Many business managers and professionals are building, launching or downloading their own applications to achieve productivity and respond
Tags : 
     Red Hat
By: Akamai Technologies     Published Date: Jun 14, 2018
"Businesses continue to evolve as digital technologies reshape industries. The workforce is mobile, and speed and ef ciency are imperative, necessitating dynamic, cloud-based infrastructures and connectivity, as well as unhindered, secure application access — from anywhere, on any device, at any time. Leaders must remove hurdles to progress, but new business initiatives and processes increase the attack surface, potentially putting the company at risk.
Tags : digital technology, cloud, security, connectivity, authenticate
     Akamai Technologies
By: Akamai Technologies     Published Date: Jun 14, 2018
"Traditional remote access technologies—like VPNs, proxies, and remote desktops—provide access in much the same way they did 20 years ago. However, new and growing business realities—like a growing mobile and distributed workforce—are forcing enterprises to take a different approach to address the complexity and security challenges that traditional access technologies present. Read 5 Reasons Enterprises Need a New Access Model to learn about the fundamental changes enterprises need to make when providing access to their private applications."
Tags : vpn, proxies, security, security breach, technology
     Akamai Technologies
By: Amazon Web Services     Published Date: Jul 25, 2018
Defining the Data Lake
“Big data” is an idea as much as a particular methodology or technology, yet it’s an idea that is enabling powerful insights, faster and better decisions, and even business transformations across many industries. In general, big data can be characterized as an approach to extracting insights from very large quantities of structured and unstructured data from varied sources at a speed that is immediate (enough) for the particular analytics use case. (A toy sketch of this idea follows this entry.)
Tags : 
     Amazon Web Services
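As a toy illustration of the data lake idea described above, the following sketch pulls one structured object and one semi-structured object from S3 and derives a simple metric from the combination. The bucket name, object keys, and record schemas are invented for this example; at real data lake scale this kind of query would typically run in a managed analytics service rather than a single script.

# Toy sketch (hypothetical bucket, keys, and schemas): combine structured and
# semi-structured objects from an S3-based data lake into a simple "insight".
import csv
import io
import json

import boto3

s3 = boto3.client("s3")
BUCKET = "example-data-lake"   # hypothetical bucket name

def read_s3_text(key):
    # Fetch an S3 object and return its body as text.
    return s3.get_object(Bucket=BUCKET, Key=key)["Body"].read().decode("utf-8")

# Structured source: order totals as CSV with columns customer_id,total.
orders = list(csv.DictReader(io.StringIO(read_s3_text("curated/orders/2018-07-01.csv"))))

# Semi-structured source: support tickets as JSON lines with free-text bodies.
tickets = [json.loads(line)
           for line in read_s3_text("raw/tickets/2018-07-01.jsonl").splitlines()]

# Toy insight: revenue attributable to customers who asked for a refund that day.
refunders = {t["customer_id"] for t in tickets if "refund" in t["body"].lower()}
at_risk = sum(float(o["total"]) for o in orders if o["customer_id"] in refunders)
print(f"Revenue from customers with refund requests: {at_risk:.2f}")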
By: TigerConnect     Published Date: Aug 15, 2018
By now you’ve seen The Joint Commission’s often-quoted statistic that nearly 80% of serious medical errors involve miscommunication during the handoff between providers and care settings. The newest clinical communication technology is designed to close dangerous communication gaps. Top clinical communication platforms offer advanced collaboration tools to ensure that critical patient information makes it safely from one care setting to another. Download this guide to learn how a clinical communication platform can significantly improve patient outcomes in 5 key areas.
Tags : 
     TigerConnect

Add White Papers

Get your white papers featured in the insideHPC White Paper Library. Contact: Kevin@insideHPC.com