processing

Results 1 - 25 of 374
By: Cray     Published Date: Aug 22, 2014
Learn how to optimize codes for faster application performance with Intel® Xeon® Phi™ coprocessor.
Tags : application performance, intel® xeon® phi™ coprocessor
     Cray
By: Seagate     Published Date: Jan 26, 2016
Finding oil and gas has always been a tricky proposition, given that reserves are hidden underground and, as often as not, under the ocean as well. The costs involved in acquiring rights to a site, drilling the wells, and operating them are considerable and have driven the industry to adopt advanced technologies for locating the most promising sites. As a consequence, oil and gas exploration today is essentially an exercise in scientific visualization and modeling, employing some of the most advanced computational technologies available. High performance computing (HPC) systems are being used to fill these needs, primarily with x86-based cluster computers and Lustre storage systems. The technology is well developed, but the scale of the problem demands medium to large-sized systems, requiring a significant capital outlay and operating expense. The most powerful systems deployed by oil and gas companies are petaflop-scale computers with multiple petabytes of attached storage.
Tags : 
     Seagate
By: Altair     Published Date: Jul 15, 2014
NSCEE’s new workload management solution, including PBS Professional, has reduced overall runtimes for processing workloads. Furthermore, NSCEE credits PBS Professional with improving system manageability and extensibility thanks to these key features:
• Lightweight solution
• Very easy to manage
• Not dependent on any specific operating system
• Can be easily extended by adding site-specific processing plugins/hooks
To learn more, click to read the full paper.
Tags : 
     Altair
By: IBM     Published Date: Nov 14, 2014
A high performance computing (HPC) cluster refers to a group of servers built from off-the-shelf components that are connected via certain interconnect technologies. A cluster can deliver aggregated computing power from its many processors with many cores — sometimes hundreds, even thousands — to meet the processing demands of more complex engineering software, and therefore deliver results faster than individual workstations. If your company is in the majority that could benefit from access to more computing power, a cluster comprised of commodity servers may be a viable solution to consider, especially now that they’re easier to purchase, deploy, configure and maintain than ever before. Read more and learn about the '5 Easy Steps to a High Performance Cluster'.
Tags : 
     IBM
By: AMD     Published Date: Nov 09, 2015
Graphics Processing Units (GPUs) have become a compelling technology for High Performance Computing (HPC), delivering exceptional performance per watt and impressive densities for data centers. AMD has partnered with Hewlett Packard Enterprise to offer compelling solutions to drive your HPC workloads to new levels of performance. Learn about the awe-inspiring performance and energy efficiency of the AMD FirePro™ S9150, found in multiple HPE servers including the popular 2U HPE ProLiant DL380 Gen9 server. See why open standards matter for HPC, and what AMD is doing in this area. Click here to read more on AMD FirePro™ server GPUs for HPE ProLiant servers.
Tags : 
     AMD
By: HPE     Published Date: Jul 21, 2016
Deep learning is a method of creating artificial intelligence systems that combine computer-based multi-layer neural networks with intensive training techniques and large data sets to enable analysis and predictive decision making. A fundamental aspect of deep learning environments is that they transcend finite programmable constraints to the realm of extensible and trainable systems. Recent developments in technology and algorithms have enabled deep learning systems to not only equal but to exceed human capabilities in the pace of processing vast amounts of information.
Tags : 
     HPE
By: This program is brought to you by Oracle and Intel     Published Date: Mar 15, 2018
Getting the most from your data, driving innovation, processing orders faster, and reducing operating costs are all essential. And Oracle Exadata is the database platform to deliver these improvements. Read five top reasons for running your business on Oracle Exadata, and find out why other organisations say it was such a good choice for them.
Tags : 
     This program is brought to you by Oracle and Intel
By: Sage     Published Date: May 15, 2018
The Sage Pay@Table restaurant experience
As your business expands, it’s time to improve your payment system so that staff can reduce waiting times by processing payments on-the-go. Find out why Sage Pay@Table is the new waiter’s friend.
Tags : 
     Sage
By: Workday     Published Date: Jan 16, 2018
Financial transformation by definition is not something you can bolt on—it requires a willingness to question long-held assumptions, a vision of where you want to go, and a total technology rethink. In the next blog, we’ll take a closer look at how one unified, cloud-based system can create the perfect environment for finance to handle transaction processing, compliance, and control while delivering the answers the business needs.
Tags : financial, technology, innovation, optimization, business, workday
     Workday
By: NEC     Published Date: Sep 29, 2009
Written by: Abner Germanow, Jonathan Edwards, and Lee Doyle (IDC). IDC believes the convergence of communications and mainstream IT architectures will drive significant innovation in business processes over the next decade.
Tags : it architecture, idc, automation, automated order processing, soa, service oriented architecture, soap, http, xml, wsdl, uddi, esbs, jee, .net, crm, san
     NEC
By: Vision Solutions     Published Date: Feb 18, 2008
Continuous member service is an important deliverable for credit unions, and the continued growth in assets and members means that the impact of downtime is affecting a larger base and is therefore potentially much more costly. Learn how new data protection and recovery technologies are making a huge impact on downtime for credit unions that depend on AIX-hosted applications.
Tags : vision, high availability, ibm, aix, cdp, credit union
     Vision Solutions
By: SAP     Published Date: Feb 03, 2017
The SAP HANA platform provides a powerful unified foundation for storing, processing, and analyzing structured and unstructured data. It runs on a single, in-memory database, eliminating data redundancy and speeding up the time for information research and analysis.
Tags : 
     SAP
By: SAP     Published Date: Feb 03, 2017
The spatial analytics features of the SAP HANA platform can help you supercharge your business with location-specific data. By analyzing geospatial information, much of which is already present in your enterprise data, SAP HANA helps you pinpoint events, resolve boundaries, locate customers, and visualize routing. Spatial processing functionality is standard with your full-use SAP HANA licenses.
Tags : 
     SAP
By: Cisco EMEA Tier 3 ABM     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever increasing rates. Analytics combine statistics, machine learning, and data preprocessing in order to extract valuable information and insights from big data.
Tags : big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
     Cisco EMEA Tier 3 ABM
By: SAP     Published Date: May 18, 2014
In-memory technology—in which entire datasets are pre-loaded into a computer’s random access memory, alleviating the need for shuttling data between memory and disk storage every time a query is initiated—has actually been around for a number of years. However, with the onset of big data, as well as an insatiable thirst for analytics, the industry is taking a second look at this promising approach to speeding up data processing.
Tags : sap, big data, real time data, in memory technology, data warehousing, analytics, big data analytics, data management, business insights, architecture, business intelligence, big data tools
     SAP
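The in-memory idea the SAP abstract describes — pre-load the dataset into RAM once instead of shuttling data between disk and memory on every query — can be shown with a small sketch. The file name, schema, and data below are invented for illustration.

```python
# Sketch of in-memory processing: load the dataset once, then serve every
# query from RAM instead of re-reading the file from disk each time.
import csv
import os
import tempfile

# Create a small "on-disk" dataset for the demo.
path = os.path.join(tempfile.gettempdir(), "sales_demo.csv")
with open(path, "w", newline="") as f:
    csv.writer(f).writerows(
        [["region", "amount"], ["EMEA", "120"], ["APAC", "75"]]
    )

def total_from_disk(region):
    """Disk-based approach: every query re-reads the file."""
    with open(path, newline="") as f:
        return sum(float(r["amount"]) for r in csv.DictReader(f)
                   if r["region"] == region)

# In-memory approach: pay the load cost once, query many times from RAM.
with open(path, newline="") as f:
    in_memory = list(csv.DictReader(f))

def total_from_memory(region):
    return sum(float(r["amount"]) for r in in_memory
               if r["region"] == region)

print(total_from_disk("EMEA"), total_from_memory("EMEA"))  # 120.0 120.0
```

Both paths return the same answer; the in-memory one simply avoids repeated I/O, which is the speed-up the abstract refers to.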
By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
In midsize and large organizations, critical business processing continues to depend on relational databases including Microsoft® SQL Server. While new tools like Hadoop help businesses analyze oceans of Big Data, conventional relational-database management systems (RDBMS) remain the backbone for online transaction processing (OLTP), online analytic processing (OLAP), and mixed OLTP/OLAP workloads.
Tags : database usage, database management, server usage, data protection
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They’re the engines that drive critical online transaction (OLTP) and online analytical (OLAP) processing applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags : cost reduction, oracle database, it operation, online transaction, online analytics
     Hewlett Packard Enterprise
By: Cisco     Published Date: Jun 21, 2016
The Cisco® Hyperlocation Solution is the industry’s first Wi-Fi network-based location system that can help businesses and users pinpoint a user’s location to within one to three meters, depending on the deployment. Combining innovative RF antenna and module design, faster and more frequent data processing, and a powerful platform for customer engagement, it can help businesses create more personalized and profitable customer experiences.
Tags : 
     Cisco
By: Pentaho     Published Date: Feb 26, 2015
This eBook from O’Reilly Media will help you navigate the diverse and fast-changing landscape of technologies for processing and storing data (NoSQL, big data, MapReduce, etc.).
Tags : data systems, data-intensive applications, scalability, maintainability, data storage, application development
     Pentaho
By: OpenText     Published Date: Mar 02, 2017
Watch the video to learn how Procure-to-Pay (P2P) solutions automate B2B processes to help you gain better visibility into transaction lifecycles, improve efficiency, and increase the speed and accuracy of order, shipping, and invoice processing.
Tags : supply chain, b2b, procure-to-pay, shipping, invoicing
     OpenText
By: Oracle CX     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operational data.
Tags : 
     Oracle CX
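The row-versus-column trade-off in the Oracle abstract can be made concrete with plain Python data structures. This is a toy sketch of the general technique, not Oracle's API; the records and field names are invented.

```python
# Row format: each record stored together -- suits OLTP point lookups.
rows = [
    {"id": 1, "region": "EMEA", "amount": 120.0},
    {"id": 2, "region": "APAC", "amount": 75.5},
    {"id": 3, "region": "EMEA", "amount": 210.0},
]

# Column format: each attribute stored contiguously -- suits analytic scans.
columns = {
    "id": [1, 2, 3],
    "region": ["EMEA", "APAC", "EMEA"],
    "amount": [120.0, 75.5, 210.0],
}

# OLTP-style lookup reads one whole record; the row layout keeps it together.
order = next(r for r in rows if r["id"] == 2)

# Analytic aggregate touches only two columns; the column layout avoids
# dragging every other field of every record through memory.
total = sum(amt for amt, reg in zip(columns["amount"], columns["region"])
            if reg == "EMEA")

print(order["amount"])  # 75.5
print(total)            # 330.0
```

Keeping both layouts populated from the same data is, in spirit, what a dual-format in-memory store does: transactions hit the row copy, analytics scan the column copy.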
By: IBM APAC     Published Date: Apr 27, 2018
While relying on x86 servers and Oracle databases to support its stock trading systems, Wanlian Securities found that processing a rapidly increasing number of transactions quickly became a huge challenge. The company shifted to IBM FlashSystem, which helped cut average response time for its Oracle databases from 10 milliseconds to less than 0.4 milliseconds and improved CPU usage by 15%. Download this case study now.
Tags : 
     IBM APAC
By: Dome9     Published Date: Apr 25, 2018
As of May 2017, according to a report from The Depository Trust & Clearing Corporation (DTCC), which provides financial transaction and data processing services for the global financial industry, cloud computing has reached a tipping point [1]. Today, financial services companies can benefit from the capabilities and cost efficiencies of the cloud. In October of 2016, the Federal Deposit Insurance Corporation (FDIC), the Office of the Comptroller of Currency (OCC) and the Federal Reserve Board (FRB) jointly announced enhanced cyber risk management standards for financial institutions in an Advanced Notice of Proposed Rulemaking (ANPR) [2]. These proposed standards for enhanced cybersecurity are aimed at protecting the entire financial system, not just the institution. To meet these new standards, financial institutions will require the right cloud-based network security platform for comprehensive security management, verifiable compliance and governance, and active protection of customer data.
Tags : 
     Dome9
By: IBM APAC     Published Date: Nov 22, 2017
Using IBM Watson’s cognitive capabilities, companies can quickly differentiate their customer service quality by being more proactive and responsive to customer needs. Simply put, chatbots and virtual agents are the future of customer interactions. Building apps from scratch that incorporate natural language processing, speech to text recognition, visual recognition, analytics, and artificial intelligence requires broad expertise in these disciplines, large staffs, and a huge financial commitment. Making use of IBM Watson cognitive services brings these capabilities in-house quickly and without the capital investment that would be needed to develop the technologies within an organization.
Tags : decision making, deeper data, insights, cognitive, analytics, ibm, watson, virtual agents
     IBM APAC
By: IBM APAC     Published Date: Nov 22, 2017
AlchemyAPI’s approach to natural language processing incorporates both linguistic and statistical analysis techniques into a single unified system. This hybrid approach provides an industry-leading advantage since both techniques have benefits and drawbacks depending on the content and specific use cases. Linguistic analysis takes a basic grammatical approach to understand how words combine into phrases, and how those phrases combine into sentences. While this approach works well with editorialized text (e.g., news articles and press releases), it does not perform as well when it comes to user-generated content, often filled with slang, misspellings and idioms. Statistical analysis, however, understands language from a mathematical standpoint and works well on “noisy” content (e.g., tweets, blog posts, and Facebook status updates). The combination of these two approaches allows for increased accuracy on a variety of content.
Tags : industry, advantage, linguistic, grammatical, statistical analysis, content
     IBM APAC
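The hybrid approach the AlchemyAPI abstract describes — combining a grammar-aware linguistic pass with an order-insensitive statistical pass — can be sketched with a toy sentiment scorer. The lexicons, negation rule, and equal weighting below are all invented for illustration; the actual system is far more sophisticated.

```python
# Minimal sketch of hybrid NLP: a rule-based (linguistic) scorer combined
# with a bag-of-words (statistical) scorer, averaged into one signal.
POSITIVE = {"great", "good", "love"}
NEGATIVE = {"bad", "terrible", "hate"}
NEGATORS = {"not", "never"}

def linguistic_score(tokens):
    """Grammar-aware rule: a negator flips the polarity of the next word."""
    score, flip = 0, False
    for tok in tokens:
        if tok in NEGATORS:
            flip = True
            continue
        polarity = 1 if tok in POSITIVE else -1 if tok in NEGATIVE else 0
        score += -polarity if flip else polarity
        flip = False
    return score

def statistical_score(tokens):
    """Bag-of-words count: robust to word order and noisy text."""
    return (sum(tok in POSITIVE for tok in tokens)
            - sum(tok in NEGATIVE for tok in tokens))

def hybrid_score(text):
    tokens = text.lower().split()
    # Average the two signals; a real system would learn these weights.
    return (linguistic_score(tokens) + statistical_score(tokens)) / 2

print(hybrid_score("not bad at all"))  # 0.0 -- negation cancels the
                                       # statistical scorer's -1
```

The example shows why the hybrid helps: on "not bad at all" the statistical pass sees only the negative word, while the linguistic pass understands the negation, and the combination lands in between.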
Get your white papers featured in the insideHPC White Paper Library contact: Kevin@insideHPC.com