database application

Results 176 - 198 of 198
By: Oracle PaaS/IaaS/Hardware     Published Date: Jul 25, 2017
"Businesses understand more than ever that they depend on data for insight and competitive advantage. And when it comes to data, they have always wanted easy access and fast performance. But how is the situation different now? Today, organizations want those elements and more. They want IT to strip away the limitations of time with faster deployment of new databases and applications. They want IT to reduce the limitations of distance by giving remote and branch offices better and more reliable access. And in a global world where business never stops, they want IT to ensure data availability around the clock. If IT can deliver databases and applications faster, on a more automated and consistent basis, to more locations without having to commit onsite resources, IT will be free to focus on more strategic projects."
Tags : 
     Oracle PaaS/IaaS/Hardware
By: Oracle PaaS/IaaS/Hardware     Published Date: Jul 25, 2017
"With the introduction of Oracle Database In-Memory and servers with the SPARC S7 and SPARC M7 processors, Oracle delivers an architecture where analytics run on live operational databases rather than on data subsets in data warehouses. Decision-making is faster and more accurate because the data is not a stale subset. And for those moving enterprise applications to the cloud, the real-time analytics of the SPARC S7 and SPARC M7 processors are available both in a private cloud on SPARC servers and in Oracle’s public cloud through the SPARC cloud compute service. Moving to the Oracle Public Cloud does not compromise the benefits of SPARC solutions. Examples of using real-time data for business decisions include analyzing supply chain data for order fulfillment and supply optimization, and analyzing customer purchase history to make real-time recommendations in online purchasing systems."
Tags : 
     Oracle PaaS/IaaS/Hardware
By: Prodiance Corp.     Published Date: Dec 22, 2008
According to Baseline Consulting, approximately 32% of corporate data is contained in end-user computing (EUC) applications and approximately 68% is stored in IT-controlled applications. These EUCs – primarily spreadsheets, PC databases (e.g., Access databases), BI reports, and Word documents – are often stored on employee desktops and corporate file shares and, for the most part, are uncontrolled. They lack the safeguards and controls one would expect of IT-controlled applications, including documentation, version control, backup and archival, change control, testing, security and access control, and more.
Tags : prodiance, end-user computing (euc), risk assessment, automation, monitoring, spreadsheet, link migration, networking
     Prodiance Corp.
By: Progress Software     Published Date: Mar 12, 2014
Big data and cloud data are still hurtling outward from the “big bang.” As the dust settles, competing forces are emerging to launch the next round of database wars – the ones that will set new rules for connectivity. Progress® DataDirect Cloud™ is well positioned to help organizations establish data control amid the chaos. How? Information workers at all levels require easy access to multiple data sources. With a premium cloud-based connectivity service, they can count on a single, standardized protocol to inform their most business-critical applications.
Tags : progess software, datadirect, big data, cloud computing, connectivity, data management, cloud data, database wars
     Progress Software
By: Qualys     Published Date: Jan 07, 2009
Choosing a solution for Vulnerability Management (VM) is a critical step toward protecting your organization's network and data. Without proven, automated technology for precise detection and remediation, no network can withstand the daily onslaught of new vulnerabilities that threaten security.
Tags : qualys, vm solution, vulnerability management, saas, database security, network patching, vulnerability patching, networking
     Qualys
By: Quest Software     Published Date: Jul 28, 2011
In this white paper, Quest's data protection experts offer five tips for effective backup and recovery to help you avoid the challenges that might keep you from fully protecting your virtual assets and infrastructure.
Tags : quest, storage virtualization, it management, systems management solutions, virtualization managemen, windows, database, application
     Quest Software
By: Red Hat     Published Date: Jan 20, 2011
In this session, attendees will explore the new performance features of Red Hat Enterprise Linux 6. John Shakshober (Shak) will share the scalability optimizations for larger SMP systems with more than 1 TB of memory, the new transparent hugepage support, multi-queue network performance for 10 Gbit and InfiniBand, and the KVM enhancements that went into Red Hat Enterprise Linux 6. Shak will also share benchmark data from industry-standard workloads using common applications such as database servers, Java engines, and various financial applications on the latest Intel and AMD x86_64 hardware.
Tags : red hat, enterprise, linux 6 performance, memory, storage, database servers
     Red Hat
By: Red Hat, Inc.     Published Date: Jul 10, 2012
Is data changing the way you do business, or is it just inventory sitting in your warehouse? The good news is that data-driven applications enhance online customer experiences, leading to higher customer satisfaction and retention, and increased purchasing.
Tags : it planning, data, data-driven applications, data challenges, data solutions, big data solutions, big data challenges, in-memory databases
     Red Hat, Inc.
By: SAP     Published Date: Jul 17, 2012
Relational database management systems (RDBMSs) are software systems that manage databases as structured sets of tables containing rows and columns, with references to one another through key values. They can optimize storage, process transactions, perform queries, and preserve the integrity of data structures. When used with applications, they provide the beating heart of the collection of business functions supported by those applications. RDBMSs vary considerably in the factors that affect the total cost of running a database application, yet users seldom follow a disciplined procedure to calculate such costs. Most choose instead to remain with a single vendor's RDBMS and never revisit the question of ongoing hardware, software, and staffing fees.
Tags : sap, infrastructure, database, data management, white paper, management, storage, business functions
     SAP
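The relational model the SAP abstract above describes, tables of rows and columns linked through key values, can be sketched in a few lines using Python's standard sqlite3 module. The schema and data here are illustrative assumptions, not drawn from the SAP paper:

```python
import sqlite3

# In-memory database: two tables linked through a shared key value (customer_id).
conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE customers (
        customer_id INTEGER PRIMARY KEY,
        name        TEXT NOT NULL
    );
    CREATE TABLE orders (
        order_id    INTEGER PRIMARY KEY,
        customer_id INTEGER REFERENCES customers(customer_id),
        total       REAL
    );
""")
conn.execute("INSERT INTO customers VALUES (1, 'Acme Corp')")
conn.execute("INSERT INTO orders VALUES (100, 1, 250.0)")

# A query reassembles related rows by joining on the key value.
row = conn.execute("""
    SELECT c.name, o.total
    FROM orders o JOIN customers c ON c.customer_id = o.customer_id
""").fetchone()
print(row)  # ('Acme Corp', 250.0)
```

The join is what the "references to one another through key values" phrasing amounts to in practice: neither table duplicates the other's data, yet a query can combine them on demand.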
By: SAS     Published Date: Oct 18, 2017
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies and real-world applications. The report’s survey quantifies user trends and readiness for data lakes.
Tags : 
     SAS
By: SAS     Published Date: Aug 28, 2018
When designed well, a data lake is an effective data-driven design pattern for capturing a wide range of data types, both old and new, at large scale. By definition, a data lake is optimized for the quick ingestion of raw, detailed source data plus on-the-fly processing of such data for exploration, analytics and operations. Even so, traditional, latent data practices are possible, too. Organizations are adopting the data lake design pattern (whether on Hadoop or a relational database) because lakes provision the kind of raw data that users need for data exploration and discovery-oriented forms of advanced analytics. A data lake can also be a consolidation point for both new and traditional data, thereby enabling analytics correlations across all data. To help users prepare, this TDWI Best Practices Report defines data lake types, then discusses their emerging best practices, enabling technologies and real-world applications. The report’s survey quantifies user trends and readiness for data lakes.
Tags : 
     SAS
By: ScriptLogic     Published Date: Mar 30, 2012
Help Desk Authority offers a complete help desk solution that helps small and medium-sized businesses efficiently track, identify, and resolve issues quickly.
Tags : help desk, tracking, tickets, help desk management, active directory, auditing, native tools, centralized
     ScriptLogic
By: ServiceNow     Published Date: Oct 18, 2013
Three IT transformations can help IT get its own house in order to become the proactive partner of the business. By applying these concepts, IT departments at major enterprises are changing the way they engage with their business peers.
Tags : technology, service automation, it workload, it infrastructure, service consolidation, it globalization, managing service relationships, consumerized service experience
     ServiceNow
By: SRC,LLC     Published Date: Jun 01, 2009
Today, organizations are collecting data at every level of their business and in volumes that in the past were unimaginable. Data sets are stored in different database systems or in files with distinctive formats, all reflecting business process, application, program software, or information type dependencies. Adding to this complexity is the distribution of these data sets across the enterprise in silos requiring a varied set of tools and/or specialized business rules for data transformation, classification, matching, and integration. Because of the massive amounts of data stored in a variety of representation formats, decision makers strain to derive insights and create business solutions that adequately span and integrate information from these disparate technology islands. Learn more today!
Tags : src, data transformation, classification, business value, geographic business intelligence, geo-bi, etl, extract
     SRC,LLC
By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to analyzing and mitigating the risks of migrating to PostgreSQL. With the ongoing shift toward open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect approach to evaluating the potential return on investment of a database technology migration. A key decision criterion for adopting any technology is whether it can support the requirements of existing applications while also fitting into longer-term strategies and needs. The first section of this eBook provides a detailed analysis of all aspects of migrating from legacy and commercial solutions to PostgreSQL:
• Schema and code migration
• Data migration
• Application code migration
• Testing and evaluation
Tags : 
     Stratoscale
By: Stratoscale     Published Date: Feb 01, 2019
This eBook offers a practical, hands-on guide to the “Day Two” challenges of accelerating large-scale PostgreSQL deployments. With the ongoing shift toward open-source database solutions, it’s no surprise that PostgreSQL is the fastest-growing database. While it’s tempting to simply compare the licensing costs of proprietary systems against those of open source, that is a misleading and incorrect approach to evaluating the potential return on investment of a database technology migration. After a PostgreSQL deployment is live, a variety of day-two scenarios require planning and strategizing. The third section of this eBook provides a detailed analysis of all aspects of accelerating large-scale PostgreSQL deployments:
• Backups and availability: strategies, point-in-time recovery, availability and scalability
• Upgrades and DevOps: the PostgreSQL upgrade process, application upgrades and CI/CD
Tags : 
     Stratoscale
By: Subrago     Published Date: Apr 30, 2009
The objective of this white paper is to highlight the key issues and discuss the processes and controls required to build a high-performing IT support organization.
Tags : it support, subrago, it costs, customer transaction, high performing it support, it dependency, tolerance level, production services
     Subrago
By: Vertica     Published Date: Feb 01, 2010
How the Vertica Analytic Database is powering the new wave of commercial software, SaaS and appliance-based applications and creating new value and competitive differentiation for solution developers and their customers.
Tags : vertica, ec2, cdr, elastic, saas, cloud computing, data management, ad-hoc
     Vertica
By: Vertica     Published Date: Oct 30, 2009
Independent research firm Knowledge Integrity Inc. examines two high-performance computing technologies that are transitioning into the mainstream: massively parallel analytical database management systems (ADBMS) and distributed parallel programming paradigms such as MapReduce (Hadoop, Pig, HDFS, etc.). By providing an overview of both concepts and looking at how the two approaches can be used together, the firm concludes that combining a high-performance batch programming and execution model with a high-performance analytical database provides significant business benefits for a number of different types of applications.
Tags : vertica, analytical computing, adbms, mapreduce, application management, data management, data mining
     Vertica
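The MapReduce paradigm named in the Knowledge Integrity abstract above can be sketched in miniature: a map step emits key-value pairs, a shuffle groups them by key, and a reduce step aggregates each group. This word-count sketch is an illustrative assumption of the pattern, not code from the report:

```python
from collections import defaultdict
from itertools import chain

def map_phase(doc):
    # Map: emit a (word, 1) pair for every word in the document.
    return [(word, 1) for word in doc.split()]

def shuffle(pairs):
    # Shuffle: group all emitted values by their key.
    groups = defaultdict(list)
    for key, value in pairs:
        groups[key].append(value)
    return groups

def reduce_phase(groups):
    # Reduce: aggregate each key's values (here, sum the counts).
    return {key: sum(values) for key, values in groups.items()}

docs = ["big data big analytics", "data warehouse"]
counts = reduce_phase(shuffle(chain.from_iterable(map_phase(d) for d in docs)))
print(counts)  # {'big': 2, 'data': 2, 'analytics': 1, 'warehouse': 1}
```

In Hadoop the map and reduce functions run in parallel across many nodes and the shuffle happens over the network; the control flow, however, is exactly this three-step pipeline.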
By: Vertica     Published Date: Jan 19, 2010
Pink OTC Market Inc. is the third-largest U.S. equity trading marketplace. Learn how Pink OTC built a highly available and highly reliable data warehouse (no downtime in a year of production use) using Vertica's Analytic DBMS, which cost-effectively stores billions of records and scales easily by simply adding CPUs, without incurring additional licensing fees.
Tags : pink, adr, data warehousing, vertica, dbms, database, analytical applications
     Vertica
By: VMTurbo     Published Date: Mar 25, 2015
An Intelligent Roadmap for Capacity Planning
Many organizations apply overly simplistic principles to determine compute capacity requirements in their virtualized data centers. These principles are based on a resource allocation model that takes the total memory and CPU allocated to all virtual machines in a compute cluster and assumes a defined level of overprovisioning (e.g., 2:1, 4:1, 8:1, 12:1) in order to calculate the requirement for physical resources. Often managed in spreadsheets or simple databases, and augmented by simple alert-based monitoring tools, the resource allocation model does not account for the actual resource consumption driven by each application workload in the operational environment, and it inherently erodes the efficiency that can be driven from the underlying infrastructure.
Tags : capacity planning, vmturbo, resource allocation model, cpu, cloud era, it management, enterprise applications
     VMTurbo
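The resource allocation model the VMTurbo abstract above criticizes reduces to simple arithmetic: divide total allocated resources by an assumed overprovisioning ratio. The function and figures below are illustrative assumptions, not taken from the paper:

```python
def physical_requirement(allocated_vcpus, allocated_mem_gb, ratio):
    """Resource allocation model: physical capacity needed is total
    allocated capacity divided by an assumed N:1 overprovisioning ratio.
    Returns (physical CPU cores, physical memory in GB)."""
    return allocated_vcpus / ratio, allocated_mem_gb / ratio

# Example: 400 vCPUs and 1600 GB of memory allocated across a cluster,
# assuming a 4:1 overprovisioning ratio.
cores, mem = physical_requirement(400, 1600, 4)
print(cores, mem)  # 100.0 400.0
```

The abstract's point is that this arithmetic ignores what each workload actually consumes: two clusters with identical allocations and ratios can have very different real utilization, which is why allocation-based planning over- or under-provisions in practice.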
By: VMTurbo     Published Date: Jul 08, 2015
This eBook covers a brief history of the OpenStack cloud operating system, its releases and projects, significant enterprise adoptions, and the function and contributions of VMTurbo.
Tags : cloud strategy, cloud operating systems, enterprise adoption, database management, business integration, web interface, application management, software licensing
     VMTurbo
By: VMware     Published Date: Feb 26, 2009
This flash provides an overview of how CA ARCserve® Backup r12 provides several new disaster recovery enhancements, including an easy-to-use Disaster Recovery wizard. Together with the replication and application failover functionality in CA XOsoft™ Replication and CA XOsoft™ High Availability, and the integrity testing features in CA XOsoft™ Assured Recovery, these tools enable you to recover servers, applications, and data swiftly and effectively.
Tags : ca arcserve, ca xosoft, vmware, restoring database application, bootable cd, bootable tape, cdp, database sessions
     VMware
Get your white papers featured in the insideHPC White Paper Library contact: Kevin@insideHPC.com