database transaction

Results 1 - 25 of 34
By: Attunity     Published Date: Feb 12, 2019
Read this technical whitepaper to learn how data architects and DBAs can avoid the struggle of complex scripting for Kafka in modern data environments. You’ll also gain tips on how to avoid the time-consuming hassle of manually configuring data producers and data type conversions. Specifically, this paper will guide you on how to overcome these challenges by leveraging innovative technology such as Attunity Replicate. The solution can easily integrate source metadata and schema changes for automated configuration of real-time data feeds, following best practices.
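For context on the manual setup this paper argues against, a hand-rolled Kafka producer configured in code might look like the minimal Python sketch below. It assumes the kafka-python package; the broker address, topic name, and record fields are placeholders, not anything from Attunity's solution.

```python
# Minimal sketch of a hand-configured Kafka producer (kafka-python package).
# The broker address, topic name, and record schema are hypothetical placeholders,
# and a broker must be running for the send to succeed.
import json
from kafka import KafkaProducer

producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda record: json.dumps(record).encode("utf-8"),
)

# Each source table change must be mapped and typed by hand.
change_event = {"table": "orders", "op": "update", "order_id": 42, "amount": "19.99"}
producer.send("cdc.orders", value=change_event)
producer.flush()
```

Multiply this by every source table, data type, and schema change, and the configuration burden the paper describes becomes clear.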
Tags : data streaming, kafka, metadata integration, metadata, apache kafka, data integration, data analytics
     Attunity
By: Cisco Umbrella EMEA     Published Date: Oct 12, 2018
Protect the network, use the network to protect students. Your schools use a growing amount of networked information and devices as an essential part of teaching and administration. Research on the Internet, laptops in class, distance learning, and online homework are some of the official technologies that your students use. Unofficially, this list extends to mobile phones (with cameras), digital music and video players, blogs, instant messaging, social networking sites, and networked or online gaming. Behind the scenes are databases with student information, class schedules, attendance records, copies of exams and answers, financial transactions, and even video surveillance. All of these rely on a robust and secure network infrastructure. Cisco Umbrella can provide this foundation. Download this whitepaper to find out more!
Tags : 
     Cisco Umbrella EMEA
By: Dell PC Lifecycle     Published Date: Mar 09, 2018
In the end, the Dell EMC VMAX 250F with Intel® Xeon® Processor All Flash storage array lived up to its promises better than the HPE 3PAR 8450 Storage array did. We experienced minimal impact to database performance when the VMAX 250F processed transactional and data mart loading at the same time. This is useful whether you're performing extensive backups or compiling large amounts of data from multiple sources. Intel Inside®. New Possibilities Outside.
Tags : 
     Dell PC Lifecycle
By: Group M_IBM Q119     Published Date: Dec 18, 2018
IBM LinuxONE™ is an enterprise Linux server engineered to deliver cloud services that are secure, fast and instantly scalable. The newest member of the family, IBM LinuxONE Emperor™ II, is designed for businesses where the following may be required:
• protecting sensitive transactions and minimizing business risk
• accelerating the movement of data, even with the largest databases
• growing users and transactions instantly while maintaining operational excellence
• accessing an open platform that speeds innovation
Tags : 
     Group M_IBM Q119
By: Group M_IBM Q418     Published Date: Sep 10, 2018
IBM LinuxONE™ is an enterprise Linux server engineered to deliver cloud services that are secure, fast and instantly scalable. The newest member of the family, IBM LinuxONE Emperor™ II, is designed for businesses where the following may be required:
• protecting sensitive transactions and minimizing business risk
• accelerating the movement of data, even with the largest databases
• growing users and transactions instantly while maintaining operational excellence
• accessing an open platform that speeds innovation
Tags : 
     Group M_IBM Q418
By: Group M_IBM Q418     Published Date: Dec 18, 2018
IBM LinuxONE™ is an enterprise Linux server engineered to deliver cloud services that are secure, fast and instantly scalable. The newest member of the family, IBM LinuxONE Emperor™ II, is designed for businesses where the following may be required:
• protecting sensitive transactions and minimizing business risk
• accelerating the movement of data, even with the largest databases
• growing users and transactions instantly while maintaining operational excellence
• accessing an open platform that speeds innovation
Tags : 
     Group M_IBM Q418
By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
In midsize and large organizations, critical business processing continues to depend on relational databases including Microsoft® SQL Server. While new tools like Hadoop help businesses analyze oceans of Big Data, conventional relational-database management systems (RDBMS) remain the backbone for online transaction processing (OLTP), online analytic processing (OLAP), and mixed OLTP/OLAP workloads.
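As a point of reference, an OLTP workload is made up of many small transactions like the one sketched below; Python's built-in sqlite3 module stands in for a full RDBMS such as SQL Server, and the table and values are invented for illustration.

```python
# Illustrative OLTP-style transaction using Python's built-in sqlite3 module.
# Table and values are made up for the example.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance REAL)")
conn.executemany("INSERT INTO accounts VALUES (?, ?)", [(1, 100.0), (2, 50.0)])

try:
    with conn:  # one transaction: commits on success, rolls back on error
        conn.execute("UPDATE accounts SET balance = balance - 25 WHERE id = 1")
        conn.execute("UPDATE accounts SET balance = balance + 25 WHERE id = 2")
except sqlite3.Error:
    print("transfer rolled back, balances unchanged")

print(conn.execute("SELECT id, balance FROM accounts").fetchall())
```

An OLAP query over the same data would instead scan and aggregate many rows at once, which is why mixed OLTP/OLAP workloads stress an RDBMS in different ways.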
Tags : database usage, database management, server usage, data protection
     Hewlett Packard Enterprise
By: Hewlett Packard Enterprise     Published Date: Aug 02, 2017
What if you could reduce the cost of running Oracle databases and improve database performance at the same time? What would it mean to your enterprise and your IT operations? Oracle databases play a critical role in many enterprises. They’re the engines that drive critical online transaction processing (OLTP) and online analytical processing (OLAP) applications, the lifeblood of the business. These databases also create a unique challenge for IT leaders charged with improving productivity and driving new revenue opportunities while simultaneously reducing costs.
Tags : cost reduction, oracle database, it operation, online transaction, online analytics
     Hewlett Packard Enterprise
By: IBM     Published Date: Dec 06, 2013
Partners and customers expect instantaneous response and continuous uptime from data systems, but the volume and velocity of data make it difficult to respond with agility. IBM PureData System for Transactions enables businesses to gear up and meet these challenges.
Tags : ibm, ibm puredata system, data, data mangement, database, integration, transactions, workload
     IBM
By: IBM     Published Date: Oct 13, 2016
Compare IBM DB2 pureScale with any other offering being considered for implementing a clustered, scalable database configuration, see how these solutions deliver continuous availability, and learn why that matters. Download now!
Tags : data queries, database operations, transactional databases, clustering, it management, storage
     IBM
By: IBM     Published Date: Jul 21, 2016
IBM's recently released DB2 version 11.1 for Linux, Unix and Windows (LUW) is a hybrid database that IBM says can handle transactional and analytic workloads thanks to its BLU Acceleration technology, which features an in-memory column store for analytical workloads that can scale across a massively parallel cluster.
Tags : ibm, db2, analytics, mpp, data warehousing
     IBM
By: IBM     Published Date: Jul 06, 2017
DB2 is a proven database for handling the most demanding transactional workloads. But the trend as of late is to enable relational databases to handle analytic queries more efficiently by adding an in-memory column store alongside to aggregate data and provide faster results. IBM's BLU Acceleration technology does exactly that. While BLU isn't brand new, the ability to spread the column store across a massively parallel processing (MPP) cluster of up to 1,000 nodes is a new addition to the technology. That, along with simpler monthly pricing options and integration with dashDB data warehousing in the cloud, makes DB2 for LUW a very versatile database.
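To illustrate the general idea behind a column store, here is a toy Python comparison; it is only an analogy, not BLU Acceleration itself, and it assumes NumPy is available.

```python
# Toy comparison of row-oriented vs column-oriented aggregation.
# This illustrates the general idea behind column stores, not BLU Acceleration itself.
import numpy as np

# Row layout: each record is a dict, so an aggregate walks every record object.
rows = [{"id": i, "region": i % 4, "amount": float(i)} for i in range(100_000)]
total_from_rows = sum(r["amount"] for r in rows)

# Column layout: the "amount" column is one contiguous array, so the same
# aggregate scans a tightly packed vector and benefits from vectorized code.
amount_column = np.arange(100_000, dtype=np.float64)
total_from_columns = float(amount_column.sum())

assert total_from_rows == total_from_columns
```

Because the column sits in one contiguous array, scans and aggregates touch far less memory per value and vectorize well, which is the property column stores exploit at much larger scale.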
Tags : memory analytics, database, efficiency, acceleration technology, aggregate data
     IBM
By: IBM     Published Date: Jun 29, 2018
IBM LinuxONE™ is an enterprise Linux server engineered to deliver cloud services that are secure, fast and instantly scalable. The newest member of the family, IBM LinuxONE Emperor™ II, is designed for businesses where the following may be required:
• protecting sensitive transactions and minimizing business risk
• accelerating the movement of data, even with the largest databases
• growing users and transactions instantly while maintaining operational excellence
• accessing an open platform that speeds innovation
Tags : 
     IBM
By: IBM APAC     Published Date: Apr 27, 2018
While relying on x86 servers and Oracle databases to support their stock trading systems, Wanlian Securities found that processing a rapidly increasing number of transactions quickly became a huge challenge. They shifted to IBM FlashSystem, which helped them cut average response time for their Oracle databases from 10 milliseconds to less than 0.4 milliseconds and improved CPU usage by 15%. Download this case study now.
Tags : 
     IBM APAC
By: Infosys     Published Date: Jun 14, 2019
The basic promise of blockchain is that it works as a global distributed ledger or database of things we value - everything from money to creative work and votes - so these can then be managed and transacted in a trusted and secure manner. Yes, blockchain is a young technology, but well past its infancy. Today, the somewhat inflated expectations around the technology have been brought to a more realistic level and we are beginning to see practical applications across the enterprise. Although these are typically in the realm of reducing cost and/or time for transactions, improving product and system security, finding and preventing fraud and counterfeiting, and increasing transparency, there are also instances of businesses relying on blockchain to create new revenue streams. Infosys recently commissioned an independent survey of senior executives worldwide. Download the report to read about their findings.
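For a mechanical sense of what "distributed ledger" means, here is a toy hash-chained ledger in Python; it is a teaching sketch only and omits the consensus, signatures, and replication that a real blockchain adds.

```python
# Toy hash-chained ledger: each block commits to the previous block's hash,
# so tampering with any earlier transaction breaks every later link.
# A teaching sketch only -- real blockchains add consensus, signatures, and replication.
import hashlib
import json

def make_block(transactions, previous_hash):
    body = json.dumps({"tx": transactions, "prev": previous_hash}, sort_keys=True)
    return {"tx": transactions, "prev": previous_hash,
            "hash": hashlib.sha256(body.encode()).hexdigest()}

genesis = make_block(["genesis"], previous_hash="0" * 64)
block1 = make_block(["alice pays bob 5"], previous_hash=genesis["hash"])
block2 = make_block(["bob pays carol 2"], previous_hash=block1["hash"])

# Verification: walk the chain and check each block points at its predecessor's hash.
for prev, block in [(genesis, block1), (block1, block2)]:
    assert block["prev"] == prev["hash"]
```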
Tags : 
     Infosys
By: Lumension     Published Date: Feb 07, 2014
Being able to compare the impact of application control and traditional anti-virus solutions could help you improve your server's performance and security.
Tags : lumension, server performance, server security, application control, anti-virus solutions, endpoints, application control solution, malware
     Lumension
By: McAfee     Published Date: Mar 08, 2013
Protect sensitive information in emerging computing models such as virtualized environments. Learn how to leverage the same security architecture to provide more effective and more efficient data security across dedicated database servers as well.
Tags : database security, virtual database environments, database virtualization, database activity monitoring, vm to vm transactions, virtual machines, virtual appliance, distributed database monitoring
     McAfee
By: NetApp     Published Date: Dec 15, 2014
Organizations of all kinds rely on their relational databases for both transaction processing and analytics, but many still have challenges in meeting their goals of high availability, security, and performance. Whether planning for a major upgrade of existing databases or considering a net new project, enterprise solution architects should realize that the storage capabilities will matter. NetApp’s systems, software, and services offer a number of advantages as a foundation for better operational results.
Tags : database, transaction processing, analytics, enterprise solution architects, storage capabilities, storage
     NetApp
By: Network Automation     Published Date: Dec 10, 2008
The challenge of creating a robust errors database, the core of iCheck, is that information must be pulled from a multitude of disparate sources. Sometimes, the data source is a database with its unique schema and field names that must be mapped to the iCheck database. Other times, the data source is a text file which must be accessed via FTP, parsed, and loaded into iCheck. And other times, the data is in RETS (Real Estate Transaction Standard) format, a real estate data format similar to XML, which is accessed via HTTP.
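A rough Python sketch of this kind of multi-source loading is shown below; every path, host, URL, query, and field name is a hypothetical placeholder rather than part of iCheck itself.

```python
# Hypothetical sketch of pulling error records from disparate sources into one
# target schema; every path, host, URL, query, and field name is a placeholder.
import csv
import io
import sqlite3
import urllib.request
from ftplib import FTP

def from_source_database(path):
    # Source database with its own schema: map its field names to ours.
    conn = sqlite3.connect(path)
    for code, text in conn.execute("SELECT err_cd, err_txt FROM source_errors"):
        yield {"error_code": code, "message": text}

def from_ftp_text_file(host, filename):
    # Text file fetched over FTP, then parsed as delimited records.
    buffer = io.BytesIO()
    with FTP(host) as ftp:
        ftp.login()
        ftp.retrbinary(f"RETR {filename}", buffer.write)
    for row in csv.reader(io.StringIO(buffer.getvalue().decode())):
        yield {"error_code": row[0], "message": row[1]}

def from_rets_over_http(url):
    # RETS payloads are XML-like and retrieved over HTTP; parsing is omitted here.
    with urllib.request.urlopen(url) as response:
        raw_xml = response.read()
    # ... parse raw_xml and yield {"error_code": ..., "message": ...} records
    return iter(())
```

Each adapter normalizes its source into the same record shape, which is the mapping work the abstract describes.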
Tags : network automation, data sources, data accuracy, database errors, networking, data management
     Network Automation
By: Oracle     Published Date: May 03, 2017
Traditional backup systems fail to meet the needs of modern organisations by focusing on backup, not recovery. They treat databases as generic files to be copied, rather than as transactional workloads with specific data integrity, consistency, performance, and availability requirements.
Tags : 
     Oracle
By: Oracle     Published Date: Feb 28, 2018
When application and database numbers increase, how does an organisation avoid overstretching its staff, multiplying costs, and adding complications? Many companies are using Oracle Exadata—a platform that’s powerful, optimised, and cloud-ready when you are. And they’re seeing, on average, a five-year ROI of 429 percent, 94 percent less unplanned downtime, and 103 percent improvement in transaction rates. See our infographic for more significant findings.
Tags : database, application, revenue, downtime, results, financial
     Oracle
By: Oracle     Published Date: Mar 22, 2018
Is your information technology (IT) organization pressured to get more work done with fewer people or on a constricted budget? Do you need to make IT a competitive asset rather than a cost center? Does your business struggle with slow software applications or data that's too often unavailable? If you answered "yes" to any of these questions, it's time to take a close look at Oracle Exadata, the world's fastest database machine exclusively designed to run Oracle Database. It is the first database machine optimized for data warehousing, online transaction processing (OLTP), and database consolidation workloads as well as in-memory databases and database as a service (DBaaS).
Tags : 
     Oracle
By: Oracle     Published Date: Oct 20, 2017
With the growing size and importance of information stored in today’s databases, accessing and using the right information at the right time has become increasingly critical. Real-time access and analysis of operational data is key to making faster and better business decisions, providing enterprises with unique competitive advantages. Running analytics on operational data has been difficult because operational data is stored in row format, which is best for online transaction processing (OLTP) databases, while storing data in column format is much better for analytics processing. Therefore, companies normally have both an operational database with data in row format and a separate data warehouse with data in column format, which leads to reliance on “stale data” for business decisions. With Oracle’s Database In-Memory and Oracle servers based on the SPARC S7 and SPARC M7 processors, companies can now store data in memory in both row and column formats, and run analytics on their operational data.
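Conceptually, the dual-format approach keeps the same data in two in-memory shapes at once. The plain-Python sketch below is only an analogy for the idea, not Oracle Database In-Memory; the table and values are invented.

```python
# Analogy for a dual-format store: the same table kept as rows (good for
# point lookups and updates) and as columns (good for scans and aggregates).
orders_rows = {
    101: {"customer": "acme", "amount": 250.0},
    102: {"customer": "globex", "amount": 75.0},
    103: {"customer": "acme", "amount": 120.0},
}
orders_columns = {
    "order_id": [101, 102, 103],
    "customer": ["acme", "globex", "acme"],
    "amount": [250.0, 75.0, 120.0],
}

# OLTP-style access hits the row format: one key, all fields.
print(orders_rows[102])

# Analytic access hits the column format: one field, all rows.
print(sum(orders_columns["amount"]))
```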
Tags : 
     Oracle
By: Oracle     Published Date: Oct 20, 2017
Databases have long served as the lifeline of the business. Therefore, it is no surprise that performance has always been top of mind. Whether it be a traditional row-formatted database to handle millions of transactions a day or a columnar database for advanced analytics to help uncover deep insights about the business, the goal is to service all requests as quickly as possible. This is especially true as organizations look to gain an edge on their competition by analyzing data from their transactional (OLTP) database to make more informed business decisions. The traditional model for doing this leverages two separate sets of resources, with an ETL being required to transfer the data from the OLTP database to a data warehouse for analysis. Two obvious problems exist with this implementation. First, I/O bottlenecks can quickly arise because the databases reside on disk; second, analysis is constantly being done on stale data. In-memory databases have helped address these problems.
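The traditional two-system model described above amounts to a periodic ETL job along the lines of this minimal sketch, which uses Python's built-in sqlite3 module and invented table names in place of a real OLTP database and warehouse.

```python
# Minimal sketch of the traditional OLTP -> ETL -> warehouse flow described above;
# both databases are in-memory SQLite stand-ins with invented tables.
import sqlite3

oltp = sqlite3.connect(":memory:")
oltp.execute("CREATE TABLE orders (id INTEGER, region TEXT, amount REAL)")
oltp.executemany("INSERT INTO orders VALUES (?, ?, ?)",
                 [(1, "east", 250.0), (2, "west", 75.0), (3, "east", 120.0)])

warehouse = sqlite3.connect(":memory:")
warehouse.execute("CREATE TABLE sales_by_region (region TEXT, total REAL)")

# Extract and transform on the OLTP side, then load into the warehouse.
# Anything written to the OLTP database after this point is invisible to
# analysts until the next ETL run -- the "stale data" problem.
aggregates = oltp.execute(
    "SELECT region, SUM(amount) FROM orders GROUP BY region").fetchall()
warehouse.executemany("INSERT INTO sales_by_region VALUES (?, ?)", aggregates)

print(warehouse.execute("SELECT * FROM sales_by_region").fetchall())
```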
Tags : 
     Oracle
By: Oracle     Published Date: Oct 20, 2017
Traditional backup systems fail to meet the needs of modern organizations by focusing on backup, not recovery. They treat databases as generic files to be copied, rather than as transactional workloads with specific data integrity, consistency, performance, and availability requirements. Additionally, highly regulated industries, such as financial services, are subject to ever-increasing regulatory mandates that require stringent protection against data breaches, data loss, malware, ransomware, and other risks. These risks require fiduciary-class data recovery to eliminate data loss exposure and ensure data integrity and compliance.
Tags : 
     Oracle