
By: Cray     Published Date: Aug 22, 2014
Learn how to optimize codes for faster application performance with Intel® Xeon® Phi™ coprocessor.
Tags : application performance, intel® xeon® phi™ coprocessor
     Cray
By: Intel     Published Date: Aug 06, 2014
Purpose-built for use with the dynamic computing resources available from Amazon Web Services™, the Intel Lustre* solution provides the fast, massively scalable storage software needed to accelerate performance, even on complex workloads. Intel is a driving force behind the development of Lustre, and is committed to providing fast, scalable, and cost-effective storage with added support and manageability. Intel® Enterprise Edition for Lustre* software provides the foundation for dynamic AWS-based workloads. Now you can innovate on your problem, not your infrastructure.
Tags : intel, cloud edition lustre software, scalable storage software
     Intel
By: IBM     Published Date: Jun 05, 2014
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : ibm
     IBM
By: IBM     Published Date: Jun 05, 2014
IBM Platform Symphony is a high-performance SOA grid server that optimizes application performance and resource sharing. Platform Symphony runs distributed application services on a scalable, shared, heterogeneous grid and accelerates a wide variety of parallel applications, quickly computing results while making optimal use of available infrastructure. Platform Symphony Developer Edition enables developers to rapidly develop and test applications without the need for a production grid. After applications are running in the Developer Edition, they are guaranteed to run at scale once published to a scaled-out Platform Symphony grid. Platform Symphony Developer Edition also enables developers to easily test and verify Hadoop MapReduce applications against IBM Platform Symphony. By leveraging IBM Platform Symphony's proven, low-latency grid computing solution, more MapReduce jobs can run faster, frequently with less infrastructure.
Tags : ibm
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
A fast, simple, scalable and complete storage solution for today’s data-intensive enterprise. IBM Spectrum Scale is used extensively across industries worldwide. Spectrum Scale simplifies data management with integrated tools designed to help organizations manage petabytes of data and billions of files—as well as control the cost of managing these ever-growing data volumes.
Tags : 
     IBM
By: RYFT     Published Date: Apr 03, 2015
The new Ryft ONE platform is a scalable 1U device that addresses a major need in the fast-growing market for advanced analytics — avoiding I/O bottlenecks that can seriously impede analytics performance on today's hyperscale cluster systems. The Ryft ONE platform is designed for easy integration into existing cluster and other server environments, where it functions as a dedicated, high-performance analytics engine. IDC believes that the new Ryft ONE platform is well positioned to exploit the rapid growth we predict for the high-performance data analysis market.
Tags : ryft, ryft one platform, 1u device, advanced analytics, avoiding i/o bottlenecks, idc
     RYFT
By: IBM     Published Date: May 20, 2015
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : 
     IBM
By: Adaptive Computing     Published Date: Feb 21, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: NVIDIA & Bright Computing     Published Date: Sep 01, 2015
As of June 2015, the second-fastest computer in the world, as measured by the Top500 list, employed NVIDIA® GPUs. Of those systems on the same list that use accelerators, 60% use NVIDIA GPUs. The performance kick provided by computing accelerators has pushed High Performance Computing (HPC) to new levels. When discussing GPU accelerators, the focus is often on the price-to-performance benefits to the end user. The true cost of managing and using GPUs goes far beyond the hardware price, however. Understanding and managing these costs helps provide more efficient and productive systems.
Tags : 
     NVIDIA & Bright Computing
By: Intel     Published Date: Apr 01, 2016
Since its beginnings in 1999 as a project at Carnegie Mellon University, Lustre, the high-performance parallel file system, has come a long, long way. Designed from the start for performance and scalability, it is now part of nearly every High Performance Computing (HPC) cluster on Top500.org’s list of the fastest computers in the world—present in 70 percent of the top 100 and nine out of the top ten. That’s an achievement for any developer—or community of developers, in the case of Lustre—to be proud of.
Tags : 
     Intel
By: IBM     Published Date: Nov 14, 2014
A high performance computing (HPC) cluster refers to a group of servers built from off-the-shelf components that are connected via certain interconnect technologies. A cluster can deliver aggregated computing power from its many processors with many cores — sometimes hundreds, even thousands — to meet the processing demands of more complex engineering software, and therefore deliver results faster than individual workstations. If your company is in the majority that could benefit from access to more computing power, a cluster comprised of commodity servers may be a viable solution to consider, especially now that they’re easier to purchase, deploy, configure and maintain than ever before. Read more and learn about the '5 Easy Steps to a High Performance Cluster'.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
Value is migrating throughout the IT industry from hardware to software and services. High Performance Computing (HPC) is no exception. IT solution providers must position themselves to maximize their delivery of business value to their clients – particularly industrial customers who often use several applications that must be integrated in a business workflow. This requires systems and hardware vendors to invest in making their infrastructure “application ready”. With its Application Ready solutions, IBM is outflanking competitors in Technical Computing and fast-tracking the delivery of client business value by providing an expertly designed, tightly integrated and performance-optimized architecture for several key industrial applications. These Application Ready solutions come with a complete high-performance cluster including servers, network, storage, operating system, management software, parallel file systems and other runtime libraries, all with commercial-level solution support.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
Swift Engineering wanted a solution that let them spend more time solving complex problems than administering the system. Cray’s CX1000 combined with Platform HPC allowed Swift to solve bigger problems, with enhanced graphics, at real-time speed.
Tags : 
     IBM
By: Dell and Intel®     Published Date: Mar 30, 2015
Dell has teamed with Intel® to create innovative solutions that can accelerate the research, diagnosis and treatment of diseases through personalized medicine. The combination of leading-edge Intel® Xeon® processors and the systems and storage expertise from Dell creates a state-of-the-art data center solution that is easy to install, manage and expand as required. Labelled the Dell Genomic Data Analysis Platform (GDAP), this solution is designed to achieve fast results with maximum efficiency. The solution is architected to solve a number of customer challenges, including the perception that implementations must be large-scale in nature, as well as compliance, security and clinician use.
Tags : 
     Dell and Intel®
By: Data Direct Networks     Published Date: Dec 31, 2015
Using high-performance parallel storage solutions, geologists and researchers can now incorporate larger data sets and execute more seismic and reservoir simulations faster than ever before, enabling higher-fidelity geological analysis and significantly reduced exploration risk. With the high costs of exploration, oil and gas companies are increasingly turning to high-performance DDN storage solutions to eliminate I/O bottlenecks and minimize risk and costs, while delivering a larger number of higher-fidelity simulations in the same time as traditional storage architectures.
Tags : 
     Data Direct Networks
By: Data Direct Networks     Published Date: Dec 31, 2015
Parallelism and direct memory access enable faster and more accurate SAS analytics. Using Remote Direct Memory Access (RDMA)-based analytics and fast, scalable, external disk systems with massively parallel access to data, SAS analytics-driven organizations can deliver timely and accurate execution for data-intensive workflows such as risk management, while incorporating larger datasets than with traditional NAS.
Tags : 
     Data Direct Networks
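The massively parallel data-access pattern the DDN abstract above credits for the speedup can be illustrated with a small sketch. Everything here is a hypothetical stand-in (not DDN's or SAS's API): an in-memory list plays the role of a large dataset, and each worker processes its own chunk concurrently instead of streaming the whole dataset through one serial reader.

```python
from concurrent.futures import ThreadPoolExecutor

# Stand-in for a large dataset (e.g. a risk-analytics table).
data = list(range(1_000_000))

def process_chunk(bounds):
    lo, hi = bounds
    # Stand-in for per-chunk analytics; each worker touches only its slice.
    return sum(data[lo:hi])

# Split the dataset into four chunks accessed in parallel.
chunks = [(i, i + 250_000) for i in range(0, 1_000_000, 250_000)]
with ThreadPoolExecutor(max_workers=4) as pool:
    partials = list(pool.map(process_chunk, chunks))

# Aggregate partial results; equals the serial sum over the whole dataset.
total = sum(partials)
print(total)
```

With real parallel storage the chunks would be striped across many drives and servers, so the concurrent reads are serviced by independent hardware paths rather than contending for one spindle.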
By: HERE Technologies     Published Date: May 24, 2019
How secure is your company’s network? The rising frequency of employee network access is fast becoming one of the most prevalent and unmanaged risks to the protection of critical enterprise data. When coupled with increasingly sophisticated cyber-attacks, the possibility of a security breach of enterprise networks becomes more likely. As one of the world’s leading location platforms in 2018, HERE shares insights and solutions to preventing identity fraud. Discover the latest facts and statistics. Learn more about the use-case of location verification when logging into your company’s network. Download the infographic from HERE Technologies.
Tags : technology, mapping, location data
     HERE Technologies
By: Dell EMC     Published Date: May 08, 2019
IDC reports companies that modernize IT infrastructure for emerging technologies such as AI thrive and see results such as launching IT services faster and closing more deals. Access this comprehensive whitepaper from Dell and Intel® to learn more.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: May 09, 2019
In nearly all midmarket businesses, staff costs are the single biggest expense – meaning that productivity gains have a substantial and lasting impact on the bottom line. This Proaction Series report, commissioned by Dell, discusses technologies that deliver the fastest, clearest productivity benefits to midmarket businesses and will address: - What midmarket firms are looking to accomplish - Why they would invest in IT solutions to achieve success in this area - Required IT solutions to support process evolution Download this report for more information.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: May 09, 2019
As midmarket companies pursue business growth, they must deliver differentiated products and experiences without compromising business or customer data. Crucial to this is having the most up-to-date IT infrastructure to support the scale and complexity of a changing application landscape. Midmarket businesses (MBs) must modernize their data centers, refreshing server infrastructure and automating their IT management processes. Those that do will propel business innovation and deliver superior customer experiences with secure, fast, and reliable business technology. Download this paper, commissioned by Dell, to learn more.
Tags : 
     Dell EMC
By: Red Hat APAC     Published Date: Jul 16, 2019
Red Hat CTO for global service providers, Ian Hood, and TelecomTV talk about multi-edge compute capacity and the fundamental need for a common platform to deliver future services. ‘If you are not feeling some pain, you are not driving fast enough,’ says Ian Hood, Red Hat CTO, Global Service Provider. The race to 5G is definitely on, and the use cases are clear. You can talk about multi-edge compute capacity and other technical issues, but the fundamental thing is to have a common platform to deliver future services. We need to get to the point where IoT everywhere, virtualized video and all the applications that come from new 5G services are delivered seamlessly. Even blockchain has a clear future within the telco environment, where the world of eSIMs, secure roaming charges and identity management can all be based on blockchain technology.
Tags : 
     Red Hat APAC
By: NetApp APAC     Published Date: Jul 08, 2019
All-flash storage is on a strong growth trajectory, but the industry is not swooning naively over the blazing-fast I/O speeds. Most IT managers are taking a mature, step-by-step approach to all-flash adoption. Falling per-gigabyte prices are making the technology more commonplace, and solid-state drives are no longer just for specialized, high-performance tasks. Before they leap, however, storage professionals want to understand the full business picture as they formulate a winning strategy for putting all-flash storage to use for more workloads. This paper offers insights for business success with all-flash storage based on IT Central Station reviews. Real users weigh in on what it takes to get the most out of the technology. It covers such aspects of flash storage as the need for simplicity and the importance of flexibility. The paper also looks at how to build a business case for all-flash and think through the implications of issues such as integration with existing infrastructure.
Tags : 
     NetApp APAC
By: TIBCO Software     Published Date: Jun 14, 2019
As a recognized leader in master data management (MDM) and a pioneer in data asset management, TIBCO EBX™ software is an innovative, single solution for managing, governing, and consuming all your shared data assets. It includes all the enterprise-class capabilities you need to create data management applications, including user interfaces for authoring and data stewardship, workflow, hierarchy management, and data integration tools. And it provides an accurate, trusted view of business functions, insights, and decisions to empower better decisions and faster, smarter actions. Download this datasheet to learn: - What makes EBX™ software unique - The capabilities of EBX software - The data it manages
Tags : 
     TIBCO Software
By: Group M_IBM Q3'19     Published Date: Jun 25, 2019
To become more effective, enterprises must fast-track projects to digitally connect their organizations. Building value and providing compelling customer experiences at lower cost requires more than a quick technology fix; it demands a business and technological commitment to a new target operating model (TOM). This operating model should offer a way of running the organization that combines digital technologies and operational capabilities to achieve improvements in revenue, customer experience and cost. Enabling digital capabilities via the TOM is an ongoing process that requires DevOps skills and agile development techniques. This is easier to achieve with partners that have the requisite capabilities to help with the creation of new digital assets.
Tags : 
     Group M_IBM Q3'19
Get your white papers featured in the insideHPC White Paper Library contact: Kevin@insideHPC.com