processing

By: Cray     Published Date: Aug 22, 2014
Learn how to optimize codes for faster application performance with the Intel® Xeon® Phi™ coprocessor.
Tags : application performance, intel® xeon® phi™ coprocessor
     Cray
By: Seagate     Published Date: Jan 26, 2016
Finding oil and gas has always been a tricky proposition, given that reserves are primarily hidden underground, and as often as not, under the ocean as well. The costs involved in acquiring rights to a site, drilling the wells, and operating them are considerable and have driven the industry to adopt advanced technologies for locating the most promising sites. As a consequence, oil and gas exploration today is essentially an exercise in scientific visualization and modeling, employing some of the most advanced computational technologies available. High performance computing (HPC) systems are being used to fill these needs, primarily with x86-based cluster computers and Lustre storage systems. The technology is well developed, but the scale of the problem demands medium to large-sized systems, requiring a significant capital outlay and operating expense. The most powerful systems deployed by oil and gas companies are represented by petaflop-scale computers with multiple petabytes of attached storage.
Tags : 
     Seagate
By: Altair     Published Date: Jul 15, 2014
NSCEE’s new workload management solution, including PBS Professional, has reduced overall runtimes for processing workloads. Furthermore, NSCEE credits PBS Professional with improving system manageability and extensibility thanks to these key features:
• Lightweight solution
• Very easy to manage
• Not dependent on any specific operating system
• Can be easily extended by adding site-specific processing plugins/hooks (see the sketch below)
To learn more, click to read the full paper.
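For readers unfamiliar with PBS Professional's plugin mechanism, the sketch below shows the general shape of a site-specific hook. It assumes PBS Professional's Python hook interface, in which the pbs module is provided by the server when the hook fires; the walltime policy itself is a made-up example, not NSCEE's.

```python
# A minimal sketch of a PBS Professional "queuejob" hook, assuming the
# standard Python hook interface. Hooks are installed with qmgr, not run
# directly; the pbs module only exists inside the PBS server.
import pbs

e = pbs.event()                     # the triggering event (job submission)
job = e.job

# Illustrative site policy: reject jobs that request no walltime.
if job.Resource_List["walltime"] is None:
    e.reject("Please request a walltime, e.g. -l walltime=01:00:00")

e.accept()                          # let the job through otherwise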
Tags : 
     Altair
By: IBM     Published Date: Nov 14, 2014
A high performance computing (HPC) cluster refers to a group of servers built from off-the-shelf components and connected via certain interconnect technologies. A cluster can deliver aggregated computing power from its many processors with many cores (sometimes hundreds, even thousands) to meet the processing demands of more complex engineering software, and therefore deliver results faster than individual workstations. If your company is in the majority that could benefit from access to more computing power, a cluster composed of commodity servers may be a viable solution to consider, especially now that clusters are easier to purchase, deploy, configure and maintain than ever before. Read more and learn about the '5 Easy Steps to a High Performance Cluster'.
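To make the "aggregated computing power" idea concrete, here is a minimal sketch of a job that spreads work across cluster processes and combines the partial results. It assumes the mpi4py package and an MPI runtime are installed; the Monte Carlo pi estimate is an illustrative workload, not one from the paper.

```python
# Launch with, e.g.:  mpiexec -n 64 python pi_estimate.py
import random
from mpi4py import MPI

comm = MPI.COMM_WORLD
rank, size = comm.Get_rank(), comm.Get_size()

# Each rank draws its own random sample (Monte Carlo estimate of pi).
n = 1_000_000
random.seed(rank)
hits = sum(1 for _ in range(n)
           if random.random() ** 2 + random.random() ** 2 <= 1.0)

# Combine the partial counts on rank 0 and print the aggregated estimate.
total = comm.reduce(hits, op=MPI.SUM, root=0)
if rank == 0:
    print("pi ~", 4.0 * total / (n * size))
```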
Tags : 
     IBM
By: AMD     Published Date: Nov 09, 2015
Graphics Processing Units (GPUs) have become a compelling technology for High Performance Computing (HPC), delivering exceptional performance per watt and impressive densities for data centers. AMD has partnered with Hewlett Packard Enterprise to offer compelling solutions to drive your HPC workloads to new levels of performance. Learn about the performance and energy efficiency of the AMD FirePro™ S9150, found in multiple HPE servers including the popular 2U HPE ProLiant DL380 Gen9 server. See why open standards matter for HPC, and what AMD is doing in this area. Click here to read more on AMD FirePro™ server GPUs for HPE ProLiant servers.
Tags : 
     AMD
By: HPE     Published Date: Jul 21, 2016
Deep learning is a method of creating artificial intelligence systems that combine computer-based multi-layer neural networks with intensive training techniques and large data sets to enable analysis and predictive decision making. A fundamental aspect of deep learning environments is that they transcend finite programmable constraints to the realm of extensible and trainable systems. Recent developments in technology and algorithms have enabled deep learning systems to not only equal but exceed human capabilities in the pace of processing vast amounts of information.
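As a rough illustration of the multi-layer neural networks described above, the following NumPy sketch runs a forward pass through a small stack of layers. All layer sizes and weights are arbitrary illustrative choices; real deep learning systems add training (backpropagation) and far larger models.

```python
import numpy as np

rng = np.random.default_rng(0)

def relu(x):
    # Elementwise nonlinearity applied between layers.
    return np.maximum(x, 0.0)

# Three stacked layers: 8 inputs -> 16 hidden -> 16 hidden -> 3 outputs.
shapes = [(8, 16), (16, 16), (16, 3)]
weights = [rng.standard_normal(s) * 0.1 for s in shapes]
biases = [np.zeros(s[1]) for s in shapes]

def forward(x):
    # Each layer is a linear map followed by a nonlinearity,
    # except the last, which produces the raw outputs.
    for i, (w, b) in enumerate(zip(weights, biases)):
        x = x @ w + b
        if i < len(weights) - 1:
            x = relu(x)
    return x

print(forward(rng.standard_normal((4, 8))).shape)  # -> (4, 3)
```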
Tags : 
     HPE
By: Ricoh     Published Date: Oct 02, 2018
Your business is changing. As a finance leader, you know that accounting is a labour-intensive, costly process where systems often don’t allow for expedient exception handling and many days are fraught with difficulty in matching invoices to other databases for reconciliation. Like most companies, you know where you want to go but may not have infrastructure or internal expertise to handle electronic fund transfers, credit card payments or cheque processing— all the pieces required to make your vision for an efficient, integrated operation a reality.
Tags : 
     Ricoh
By: Cognizant     Published Date: Oct 23, 2018
A group of emerging technologies is rapidly creating numerous opportunities for life sciences companies to improve productivity, enhance patient care and ensure regulatory compliance. These technologies include robotic process automation (RPA), artificial intelligence (AI), machine learning (ML), blockchain, the Internet of Things (IoT), 3-D printing and augmented reality/virtual reality (AR/VR). This whitepaper presents a preview of five pivotal technology trends remaking the life sciences industry: AI and automation, human augmentation, edge analytics/processing, data ownership and protection, and the intermingling of products and services.
Tags : cognizant, life sciences, patient care
     Cognizant
By: Group M_IBM Q418     Published Date: Oct 15, 2018
The enterprise data warehouse (EDW) has been a cornerstone of enterprise data strategies for over 20 years. EDW systems have traditionally been built on relatively costly hardware infrastructures, and ever-growing data volumes and increasingly complex processing have raised the cost of EDW software and hardware licenses while impacting the performance needed for analytic insights. Organizations can now use EDW offloading and optimization techniques to reduce the costs of storing, processing and analyzing large volumes of data.

Getting data governance right is critical to your business success. That means ensuring your data is clean, of excellent quality, and of verifiable lineage. Such governance principles can be applied in Hadoop-like environments. Hadoop is designed to store, process and analyze large volumes of data at significantly lower cost than a data warehouse, but to get the return on investment, you must infuse data governance processes as part of offloading.
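One common offloading step the abstract alludes to, landing warehouse extracts in Hadoop-friendly storage while preserving lineage for governance, might look roughly like the PySpark sketch below. The paths, column names, and lineage columns are hypothetical.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("edw-offload").getOrCreate()

# Read a nightly warehouse extract (CSV layout assumed for illustration).
df = spark.read.option("header", True).csv("hdfs:///staging/sales_extract.csv")

# Record simple provenance columns so downstream governance processes
# can verify where the data came from and when it landed.
df = (df.withColumn("src_system", F.lit("EDW"))
        .withColumn("ingested_at", F.current_timestamp()))

# Persist as Parquet, the typical low-cost columnar layout for analytics.
df.write.mode("overwrite").parquet("hdfs:///lake/sales")
```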
Tags : 
     Group M_IBM Q418
By: Group M_IBM Q418     Published Date: Oct 23, 2018
The General Data Protection Regulation (GDPR) seeks to create a harmonized data protection framework across the European Union and aims to give EU citizens back control of their personal data by imposing stricter requirements on those hosting and processing this data, anywhere in the world. IBM is committed to putting data responsibility first and providing solutions that are secure to the core for all customers. As such, IBM Cloud has fully adopted the EU Data Protection Code of Conduct for Cloud Service Providers, meaning we agree to meet the entirety of its stringent requirements.
Tags : 
     Group M_IBM Q418
By: DocuSign UK     Published Date: Aug 08, 2018
"Many financial services firms have automated the vast majority of key processes and customer experiences. However, the “last mile” of most transactions – completing the agreement– far too often relies on the same inefficient pen-and-paper processes of yesteryear. Digitising agreements using DocuSign lets you keep processes digital from end to end. Completing transactions no longer requires documents to be printed and shipped, and re-keyed on the back end. Read the whitepaper to learn how leading financial services organisations use straight-through processing by automating the last mile of business transactions to: - Speed processes by 80% or more, often going from days or weeks to just minutes - Reduce NIGO by anywhere from 55% to 93% - Achieve a 300% average ROI "
Tags : 
     DocuSign UK
By: TIBCO Software EMEA     Published Date: Sep 12, 2018
By processing real-time data from machine sensors using artificial intelligence and machine learning, it’s possible to predict critical events and take preventive action to avoid problems. TIBCO helps manufacturers around the world predict issues with greater accuracy, reduce downtime, increase quality, and improve yield. Read about our top data science best practices for becoming a smart manufacturer.
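TIBCO's products are not shown here, but the core idea of scoring streaming sensor readings against a learned baseline can be sketched in a few lines. The rolling z-score below is a deliberately simple stand-in for the machine learning models the abstract describes; the window size and threshold are arbitrary.

```python
from collections import deque
from statistics import mean, stdev

class SensorMonitor:
    """Flag readings that deviate sharply from a rolling baseline."""

    def __init__(self, window=50, threshold=3.0):
        self.history = deque(maxlen=window)
        self.threshold = threshold

    def update(self, reading):
        # Only score once we have enough history for a stable baseline.
        if len(self.history) >= 10:
            mu, sigma = mean(self.history), stdev(self.history)
            if sigma > 0 and abs(reading - mu) / sigma > self.threshold:
                self.history.append(reading)
                return True          # anomalous: candidate for preventive action
        self.history.append(reading)
        return False

monitor = SensorMonitor()
for temp in [70.1, 70.4, 69.9] * 10 + [92.5]:   # spike at the end
    if monitor.update(temp):
        print("anomaly:", temp)
```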
Tags : inter-company connectivity, real-time tracking, automate analytic models, efficient analytics, collaboration
     TIBCO Software EMEA
By: NEC     Published Date: Sep 29, 2009
Written by IDC's Abner Germanow, Jonathan Edwards, and Lee Doyle. IDC believes the convergence of communications and mainstream IT architectures will drive significant innovation in business processes over the next decade.
Tags : it architecture, idc, automation, automated order processing, soa, service oriented architecture, soap, http, xml, wsdl, uddi, esbs, jee, .net, crm, san
     NEC
By: Vision Solutions     Published Date: Feb 18, 2008
Continuous member service is an important deliverable for credit unions, and the continued growth in assets and members means that the impact of downtime affects a larger base and is therefore potentially much more costly. Learn how new data protection and recovery technologies are making a huge impact on downtime for credit unions that depend on AIX-hosted applications.
Tags : vision, high availability, ibm, aix, cdp, core union
     Vision Solutions
By: SAS     Published Date: Aug 17, 2018
This SAS and Intel collaboration demonstrates the value of modernizing your analytics infrastructure using SAS® software on Intel processing. Readers will learn:
• Benefits of applying a consistent analytic vision across all functions within the organization to make more insight-driven decisions
• How IT plays a pivotal role in modernizing analytics infrastructures
• Competitive advantages of modern analytics
Tags : 
     SAS
By: SAP     Published Date: Feb 03, 2017
The SAP HANA platform provides a powerful unified foundation for storing, processing, and analyzing structured and unstructured data. It runs on a single, in-memory database, eliminating data redundancy and speeding up the time for information research and analysis.
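For developers wondering what working against the platform looks like, here is a hedged connectivity sketch using SAP's hdbcli Python driver (installable via pip). The host, port, credentials, and table are placeholders.

```python
from hdbcli import dbapi

# Connect to a HANA instance; all connection details below are placeholders.
conn = dbapi.connect(
    address="hana.example.com",   # hypothetical host
    port=30015,
    user="ANALYST",
    password="secret",
)
cur = conn.cursor()

# A simple aggregate runs directly against the in-memory column store.
cur.execute("SELECT region, SUM(revenue) FROM sales GROUP BY region")
for region, revenue in cur.fetchall():
    print(region, revenue)

conn.close()
```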
Tags : 
     SAP
By: SAP     Published Date: Feb 03, 2017
The spatial analytics features of the SAP HANA platform can help you supercharge your business with location-specific data. By analyzing geospatial information, much of which is already present in your enterprise data, SAP HANA helps you pinpoint events, resolve boundaries, locate customers, and visualize routing. Spatial processing functionality is standard with your full-use SAP HANA licenses.
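Building on the same hdbcli driver shown above, the sketch below issues one of the location queries the abstract hints at: finding customers within five kilometres of a point using HANA's spatial functions. The table and column names are hypothetical, and the exact spatial syntax should be checked against your HANA version.

```python
from hdbcli import dbapi

# Placeholder connection details, as in the previous sketch.
conn = dbapi.connect(address="hana.example.com", port=30015,
                     user="ANALYST", password="secret")
cur = conn.cursor()

# Assumes a customers table with a geometry column "location" (SRID 4326).
# ST_Distance with the 'meter' unit returns a distance in metres.
cur.execute("""
    SELECT customer_id
    FROM customers
    WHERE location.ST_Distance(
            ST_GeomFromText('POINT(13.4050 52.5200)', 4326), 'meter') < 5000
""")
print([row[0] for row in cur.fetchall()])

conn.close()
```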
Tags : 
     SAP
By: Cisco EMEA     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever increasing rates. Analytics combine statistics, machine learning, and data preprocessing in order to extract valuable information and insights from big data.
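The combination the last sentence describes, preprocessing feeding a statistical machine learning model, is easy to see in miniature with scikit-learn. The dataset below is synthetic and the model choice is arbitrary.

```python
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

# Synthetic stand-in for "big data": 1000 rows, 20 features.
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Preprocessing (scaling) feeds a statistical model (logistic regression).
model = make_pipeline(StandardScaler(), LogisticRegression())
model.fit(X_train, y_train)
print("held-out accuracy:", model.score(X_test, y_test))
```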
Tags : big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
     Cisco EMEA
By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
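The speed-up the abstract attributes to in-memory processing can be demonstrated with nothing more than Python's standard library: sqlite3 accepts ":memory:" as a database path, so the same query can be timed against RAM and against disk. Timings vary by machine and are indicative only.

```python
import os
import sqlite3
import tempfile
import time

def bench(path):
    # Build a table of 200k rows, then time a simple aggregate query.
    conn = sqlite3.connect(path)
    conn.execute("CREATE TABLE t (x INTEGER)")
    conn.executemany("INSERT INTO t VALUES (?)", ((i,) for i in range(200_000)))
    conn.commit()
    start = time.perf_counter()
    conn.execute("SELECT SUM(x) FROM t").fetchone()
    elapsed = time.perf_counter() - start
    conn.close()
    return elapsed

print("in-memory:", bench(":memory:"))
with tempfile.TemporaryDirectory() as d:
    print("on disk  :", bench(os.path.join(d, "bench.db")))
```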
Tags : sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools
     SAP
By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
In midsize and large organizations, critical business processing continues to depend on relational databases including Microsoft® SQL Server. While new tools like Hadoop help businesses analyze oceans of Big Data, conventional relational-database management systems (RDBMS) remain the backbone for online transaction processing (OLTP), online analytic processing (OLAP), and mixed OLTP/OLAP workloads.
Tags : database usage, database management, server usage, data protection
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They’re the engines that drive critical online transaction (OLTP) and online analytical (OLAP) processing applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags : cost reduction, oracle database, it operation, online transaction, online analytics
     Hewlett Packard Enterprise
By: Cisco     Published Date: Jun 21, 2016
The Cisco® Hyperlocation Solution is the industry’s first Wi-Fi network-based location system that can help businesses and users pinpoint a user’s location to within one to three meters, depending on the deployment. Combining innovative RF antenna and module design, faster and more frequent data processing, and a powerful platform for customer engagement, it can help businesses create more personalized and profitable customer experiences.
Tags : 
     Cisco
By: Pentaho     Published Date: Feb 26, 2015
This eBook from O’Reilly Media will help you navigate the diverse and fast-changing landscape of technologies for processing and storing data (NoSQL, big data, MapReduce, etc).
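Since the eBook covers MapReduce, a toy single-process version of the model may help orient readers: a map phase emits key/value pairs, a shuffle groups them by key, and a reduce phase aggregates each group. Real frameworks distribute exactly these steps across many machines.

```python
from collections import defaultdict

docs = ["big data systems", "data intensive applications", "big systems"]

# Map: each document emits (word, 1) pairs.
mapped = [(word, 1) for doc in docs for word in doc.split()]

# Shuffle: group emitted values by key.
groups = defaultdict(list)
for word, count in mapped:
    groups[word].append(count)

# Reduce: aggregate each group independently (hence parallelizable).
counts = {word: sum(vals) for word, vals in groups.items()}
print(counts)   # e.g. {'big': 2, 'data': 2, 'systems': 2, ...}
```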
Tags : data systems, data-intensive applications, scalability, maintainability, data storage, application development
     Pentaho
By: OpenText     Published Date: Mar 02, 2017
Watch the video to learn how Procure-to-Pay (P2P) solutions automate B2B processes to help you gain better visibility into transaction lifecycles, improve efficiency, and increase the speed and accuracy of order, shipping, and invoice processing.
Tags : supply chain, b2b, procure-to-pay, shipping, invoicing
     OpenText
By: Amazon Web Services     Published Date: Sep 05, 2018
Big data alone does not guarantee better business decisions. Often that data needs to be moved and transformed so Insight Platforms can discern useful business intelligence. To deliver those results faster than traditional Extract, Transform, and Load (ETL) technologies, use Matillion ETL for Amazon Redshift. This cloud-native ETL/ELT offering, built specifically for Amazon Redshift, simplifies the process of loading and transforming data and can help reduce your development time. This white paper will focus on approaches that can help you maximize your investment in Amazon Redshift. Learn how the scalable, cloud-native architecture and fast, secure integrations can benefit your organization, and discover ways this cost-effective solution is designed with cloud computing in mind. In addition, we will explore how Matillion ETL and Amazon Redshift make it possible for you to automate data transformation directly in the data warehouse to deliver analytics and business intelligence (BI).
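The "transform directly in the data warehouse" approach (ELT rather than ETL) can be sketched with the AWS Redshift Data API via boto3, as below. The cluster, database, user, and SQL are placeholders; in practice a tool like Matillion generates and orchestrates such statements rather than you writing them by hand.

```python
import boto3

client = boto3.client("redshift-data", region_name="us-east-1")

# Push the transformation down to Redshift instead of moving data out:
# the aggregation runs inside the warehouse on data already loaded there.
resp = client.execute_statement(
    ClusterIdentifier="analytics-cluster",   # hypothetical cluster
    Database="dev",
    DbUser="etl_user",
    Sql="""
        INSERT INTO sales_daily
        SELECT order_date, SUM(amount)
        FROM sales_raw
        GROUP BY order_date
    """,
)
print("statement id:", resp["Id"])
```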
Tags : 
     Amazon Web Services