data management

By: Altair     Published Date: Feb 19, 2014
PBS Works™, Altair's suite of on-demand cloud computing technologies, allows enterprises to maximize ROI on existing infrastructure assets. PBS Works is the most widely implemented software environment for managing grid, cloud, and cluster computing resources worldwide. The suite’s flagship product, PBS Professional®, allows enterprises to easily share distributed computing resources across geographic boundaries. With additional tools for portal-based submission, analytics, and data management, the PBS Works suite is a comprehensive solution for optimizing HPC environments. Leveraging a revolutionary “pay-for-use” unit-based business model, PBS Works delivers increased value and flexibility over conventional software-licensing models.
Tags : 
     Altair
By: IBM     Published Date: Jun 05, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : ibm
     IBM
By: IBM     Published Date: Sep 02, 2014
Learn how GPFS accelerates data intensive work flows and lowers storage costs in Life Sciences, Energy Exploration, Government, Media & Entertainment and Financial Services by removing data related bottlenecks, simplifying data management at scale, empowering global collaboration, managing the full data life cycle cost effectively and ensuring end-to-end data availability, reliability, and integrity.
Tags : ibm, complete storage solution, gpfs
     IBM
By: IBM     Published Date: Sep 02, 2014
Whether engaged in genome sequencing, drug design, product analysis or risk management, life sciences research needs high-performance technical environments with the ability to process massive amounts of data and support increasingly sophisticated simulations and analyses. Learn how IBM solutions such as IBM® Platform Computing™ high-performance cluster, grid and high-performance computing (HPC) cloud management software can help life sciences organizations transform and integrate their compute environments to develop products better, faster and at less expense.
Tags : ibm, life sciences, platform computing
     IBM
By: IBM     Published Date: Sep 02, 2014
With tougher regulations and continuing market volatility, financial firms are moving to active risk management with a focus on counterparty risk. Firms are revamping their risk and trading practices from top to bottom. They are adopting new risk models and frameworks that support a holistic view of risk. Banks recognize that technology is critical for this transformation, and are adding state-of-the-art enterprise risk management solutions, high performance data and grid management software, and fast hardware. Join IBM Algorithmics and IBM Platform Computing to gain insights on this trend and on technologies for enabling active "real-time" risk management.
Tags : 
     IBM
By: IBM     Published Date: Sep 02, 2014
This brief webcast will cover the new and enhanced capabilities of Elastic Storage 4.1, including native encryption and secure erase, flash-accelerated performance, network performance monitoring, global data sharing, NFS data migration and more. IBM GPFS (Elastic Storage) may be the key to improving your organization's effectiveness and can help define a clear data management strategy for future data growth and support.
Tags : ibm, elastic storage
     IBM
By: IBM     Published Date: Sep 16, 2015
A fast, simple, scalable and complete storage solution for today’s data-intensive enterprise. IBM Spectrum Scale is used extensively across industries worldwide. Spectrum Scale simplifies data management with integrated tools designed to help organizations manage petabytes of data and billions of files, as well as control the cost of managing these ever-growing data volumes.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
According to our global study of more than 800 cloud decision makers and users, organizations are becoming increasingly focused on the business value cloud provides. Cloud is integral to mobile, social and analytics initiatives, and to the big data management challenge that often comes with them; it helps power the entire suite of game-changing technologies. Enterprises can aim higher when these deployments ride on the cloud. Mobile, analytics and social implementations can be bigger, bolder and drive greater impact when backed by scalable infrastructure. In addition to scale, cloud can provide integration, gluing the individual technologies into more cohesive solutions. Learn how companies are gaining a competitive advantage with cloud computing.
Tags : 
     IBM
By: Adaptive Computing     Published Date: Feb 21, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: IBM     Published Date: Nov 14, 2014
This book examines data storage and management challenges and explains software-defined storage, an innovative solution for high-performance, cost-effective storage using the IBM General Parallel File System (GPFS).
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
View this series of short webcasts to learn how IBM Platform Computing products can help you ‘maximize the agility of your distributed computing environment’ by improving operational efficiency, simplifying the user experience, optimizing application usage and license sharing, addressing spikes in infrastructure demand and reducing data management costs.
Tags : 
     IBM
By: IBM     Published Date: Feb 13, 2015
To quickly and economically meet new and peak demands, Platform LSF (SaaS) and Platform Symphony (SaaS) workload management as well as Elastic Storage on Cloud data management software can be delivered as a service, complete with SoftLayer cloud infrastructure and 24x7 support for technical computing and service-oriented workloads. Watch this demonstration to learn how the IBM Platform Computing Cloud Service can be used to simplify and accelerate financial risk management using IBM Algorithmics.
Tags : 
     IBM
By: Dell and Intel®     Published Date: Nov 18, 2015
Unleash the extreme performance and scalability of the Lustre® parallel file system for high performance computing (HPC) workloads, including technical ‘big data’ applications common within today’s enterprises. The Dell Storage for HPC with Intel® Enterprise Edition (EE) for Lustre Solution allows end-users that need the benefits of large–scale, high bandwidth storage to tap the power and scalability of Lustre, with its simplified installation, configuration, and management features that are backed by Dell and Intel®.
Tags : 
     Dell and Intel®
By: Penguin Computing     Published Date: Oct 14, 2015
IT organizations are facing increasing pressure to deliver critical services to their users while their budgets are either reduced or maintained at current levels. New technologies have the potential to deliver industry-changing information to users who need data in real time, but only if the IT infrastructure is designed and implemented to do so. While computing power continues to decline in cost, the cost of managing and running large data centers continues to increase. Server administration over the life of a computing asset will consume about 75 percent of its total cost.
Tags : 
     Penguin Computing
By: Avere Systems     Published Date: Jun 27, 2016
This white paper reviews common HPC-environment challenges and outlines solutions that can help IT professionals deliver best-in-class HPC cloud solutions without undue stress and organizational chaos. The paper:
• Identifies current issues, including data management, data center limitations, user expectations, and technology shifts, that stress IT teams and existing infrastructure across industries and HPC applications
• Describes the potential cost savings, operational scale, and new functionality that cloud solutions can bring to big compute
• Characterizes technical and other barriers to an all-cloud infrastructure and describes how IT teams can leverage a hybrid cloud for compute power, maximum flexibility, and protection against lock-in scenarios
Tags : 
     Avere Systems
By: Data Direct Networks     Published Date: Dec 31, 2015
Parallelism and direct memory access enable faster and more accurate SAS analytics. Using Remote Direct Memory Access (RDMA)-based analytics and fast, scalable, external disk systems with massively parallel access to data, SAS analytics-driven organizations can deliver timely and accurate execution for data-intensive workflows such as risk management, while incorporating larger datasets than traditional NAS allows.
Tags : 
     Data Direct Networks
By: Amazon Web Services     Published Date: Jul 25, 2018
What is a Data Lake? Today’s organizations are tasked with managing multiple data types, coming from a wide variety of sources. Faced with massive volumes and heterogeneous types of data, organizations are finding that in order to deliver insights in a timely manner, they need a data storage and analytics solution that offers more agility and flexibility than traditional data management systems. Data Lakes are a new and increasingly popular way to store and analyze data that addresses many of these challenges. A Data Lake allows an organization to store all of its data, structured and unstructured, in one centralized repository. Since data can be stored as-is, there is no need to convert it to a predefined schema, and you no longer need to know in advance what questions you want to ask of your data. Download to find out more now.
Tags : 
     Amazon Web Services
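The schema-on-read idea described above (store data as-is, apply structure only when querying) can be sketched in a few lines of stdlib-only Python. This is an illustrative toy, not AWS code: the `lake` list stands in for a central repository, and the field names are hypothetical.

```python
import json

# A minimal schema-on-read sketch: raw records of different shapes are
# stored exactly as received; structure is applied only at query time.
lake = []  # stands in for a centralized repository such as a data lake

def ingest(raw_record: str) -> None:
    """Store the record as-is -- no upfront schema conversion."""
    lake.append(raw_record)

def query_names(records) -> list:
    """Apply structure on read: parse each record and project one field."""
    names = []
    for raw in records:
        doc = json.loads(raw)
        # Different sources may use different field names; reconcile on read.
        name = doc.get("name") or doc.get("customer_name")
        if name:
            names.append(name)
    return names

ingest('{"name": "Ada", "score": 10}')
ingest('{"customer_name": "Grace", "region": "EU"}')
ingest('{"event": "login"}')  # has no name field -- stored anyway

print(query_names(lake))  # ['Ada', 'Grace']
```

Because nothing is rejected at ingest time, new questions (here, a different field projection) can be asked of old data without re-loading it.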
By: Amazon Web Services     Published Date: Jul 25, 2018
Organizations are collecting and analyzing increasing amounts of data, making it difficult for traditional on-premises solutions for data storage, data management, and analytics to keep pace. Amazon S3 and Amazon Glacier provide an ideal storage solution for data lakes. They offer a breadth and depth of integration with traditional big data analytics tools, as well as innovative query-in-place analytics tools that help you eliminate costly and complex extract, transform, and load processes. This guide explains each of these options and provides best practices for building your Amazon S3-based data lake.
Tags : 
     Amazon Web Services
By: Aberdeen     Published Date: Jun 17, 2011
Download this paper to learn the top strategies leading executives are using to take full advantage of the insight they receive from their business intelligence (BI) systems - and turn that insight into a competitive weapon.
Tags : aberdeen, michael lock, data-driven decisions, business intelligence, public sector, analytics, federal, state, governmental, decisions, data management
     Aberdeen
By: CA Technologies EMEA     Published Date: Sep 07, 2018
As the application economy drives companies to roll out applications more quickly, companies are seeing testing in a new light. Once considered a speed bump on the DevOps fast track, new tools and testing methodologies are emerging to bring testing up to speed. In this ebook, we’ll explore some of the challenges on the road to continuous testing, along with new approaches that will help you adopt next-gen testing practices that offer the ability to test early, often and automatically.
Tags : continuous delivery, application delivery, testing, test data management
     CA Technologies EMEA
By: CA Technologies EMEA     Published Date: Sep 07, 2018
There are five ways to provision test data. You can copy or take a snapshot of your production database or databases. You can provision data manually or via a spreadsheet. You can derive virtual copies of your production database(s). You can generate subsets of your production database(s). And you can generate synthetic data that is representative of your production data but is not actually real. Of course, the first four examples assume that the data you need for testing purposes is available to you from your production databases. If this is not the case, then only manual or synthetic data provision is a viable option. Download this whitepaper to find out more about how CA Technologies can help your business solve its test data problems.
Tags : 
     CA Technologies EMEA
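The synthetic-data option above can be sketched in a few lines. This is a toy illustration, not CA Test Data Manager's implementation: the record shape, field names, and value ranges are all invented for the example.

```python
import random
import string

# Minimal synthetic test-data sketch: records shaped like production data
# but containing no real values. Seeded so runs are repeatable.
random.seed(42)

REGIONS = ["NA", "EU", "APAC"]  # hypothetical reference values

def synthetic_customer(customer_id: int) -> dict:
    """Produce one representative but entirely artificial customer record."""
    name = "".join(random.choices(string.ascii_uppercase, k=8))
    return {
        "id": customer_id,
        "name": name,
        "region": random.choice(REGIONS),
        "balance": round(random.uniform(0, 10_000), 2),
    }

rows = [synthetic_customer(i) for i in range(100)]
print(len(rows), rows[0]["region"] in REGIONS)
```

Because no value originates in production, this route sidesteps the privacy concerns that come with copying or subsetting real databases.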
By: CA Technologies EMEA     Published Date: Sep 07, 2018
With the application economy in full swing, more organizations are turning to Continuous Testing and DevOps development practices in order to quickly roll out applications that reflect the ever-changing needs of tech-savvy, experience-driven consumers. Yet testing teams often cannot get the rigorous data they need, in the right formats, forcing them to postpone their testing until the next sprint. As a result, organizations like yours are increasingly looking for ways to overcome the challenges of poor-quality data and slow, manual data provisioning. They are also concerned about compliance and data privacy when using sensitive information for testing. CA Test Data Manager can help you mitigate all these concerns, so you’re positioned to achieve real cost savings.
Tags : continuous delivery, application delivery, testing, test data management
     CA Technologies EMEA
By: CA Technologies EMEA     Published Date: Sep 12, 2018
To compete successfully in today’s economy, companies from all industries require the ability to deliver software faster, with higher quality, and at reduced risk and cost. This is only possible with a modern software factory that can deliver quality software continuously. Yet for most enterprises, testing has not kept pace with modern development methodologies. A new approach to software testing is required: Continuous Testing. In the first session in a series, join product management leadership to gain in-depth insights on how continuous testing, by shifting testing left and automating all aspects of test case generation and execution, enables you to deliver quality software faster than ever. Recorded Feb 5, 2018 (49 mins). Presented by Steve Feloney, VP Product Management, CA Technologies.
Tags : continuous delivery, application delivery, testing, test data management
     CA Technologies EMEA
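The automated test-case generation mentioned above can be illustrated with a small combinatorial sketch. This is a generic technique, not the vendor's engine, and the parameter names are hypothetical: enumerate every combination of input parameters so cases are produced mechanically instead of authored by hand.

```python
from itertools import product

# Illustrative combinatorial test-case generation: every combination of
# three hypothetical parameters becomes one executable test case.
browsers = ["chrome", "firefox"]
locales = ["en", "de", "fr"]
tiers = ["free", "paid"]

test_cases = [
    {"browser": b, "locale": loc, "tier": t}
    for b, loc, t in product(browsers, locales, tiers)
]

print(len(test_cases))  # 2 * 3 * 2 = 12 generated cases
```

Real tools typically add pairwise reduction and data binding on top, but the core idea is the same: the case list is derived from the model, so it regenerates automatically when the model changes.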
By: CA Technologies EMEA     Published Date: Sep 30, 2018
If you’re relying on manual processes for testing applications, artificial intelligence (AI) and machine learning (ML) can help you build more efficient continuous frameworks for quality delivery. In this on-demand webinar, “Continuous Intelligent Testing: Applying AI and ML to Your Testing Practices,” you’ll learn how to:
• Use AI and ML as the new, necessary approach for testing intelligent applications.
• Strategically apply AI and ML to your testing practices.
• Identify the tangible benefits of continuous intelligent testing.
• Reduce risk while driving test efficiency and improvement.
This webinar offers practical steps to applying AI and ML to your app testing. The speaker, Jeff Scheaffer, is senior vice president and general manager of the Continuous Delivery Business Unit at CA Technologies. His specialties include DevOps, Mobility, Software as a Service (SaaS) and Continuous Delivery (CD/CI).
Tags : continuous delivery, application delivery, testing, test data management
     CA Technologies EMEA
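The "learning from test history" idea behind ML-assisted testing can be shown with a deliberately tiny heuristic. This is an illustration of the concept only, not the product's logic, and the test names and history are invented: rank tests by historical failure rate so the most defect-prone run first.

```python
from collections import Counter

# Hypothetical pass/fail history: (test name, passed?) per run.
history = [
    ("test_login", True), ("test_login", False), ("test_login", False),
    ("test_search", True), ("test_search", True),
    ("test_checkout", False), ("test_checkout", True),
]

runs, failures = Counter(), Counter()
for name, passed in history:
    runs[name] += 1
    if not passed:
        failures[name] += 1

def failure_rate(name: str) -> float:
    """Fraction of recorded runs in which this test failed."""
    return failures[name] / runs[name]

# Schedule the historically flakiest / most defect-prone tests first.
prioritized = sorted(runs, key=failure_rate, reverse=True)
print(prioritized)  # ['test_login', 'test_checkout', 'test_search']
```

A real ML approach would add features such as code churn and coverage, but even this one-signal ranking shows how historical data can drive test selection instead of a fixed suite order.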
By: CA Technologies EMEA     Published Date: Sep 10, 2018
Companies struggle to find the right test data when testing applications, which leads to bottlenecks, defects and constant delays. There is a better way, and we want to show you how. Join us for this webcast to learn:
- How Test Data Manager finds, builds, protects and delivers test data fast
- How to get your testing teams moving toward self-sufficiency with test data
Get your questions answered. Come away happy! Recorded Aug 20, 2018 (60 mins). Presented by Prashant Pandey, CA Technologies.
Tags : continuous delivery, application delivery, testing, test data management
     CA Technologies EMEA

Get your white papers featured in the insideHPC White Paper Library contact: Kevin@insideHPC.com