By: SolidFire_NetApp     Published Date: May 10, 2016
Founded in 1992 as a provider of integrated network, voice and data centre solutions, Colt’s business today has grown to encompass a wide range of IT services, spanning enterprise application hosting, business critical cloud and end-user computing solutions. Colt has 29 data centre locations supporting thousands of customers across 28 countries in Europe and Asia, including Swiss International Airlines, Shurgard, Berenberg, and Jaguar Land Rover. Colt’s award-winning solution portfolio is based on end-to-end data centre, network and IT services capabilities; its aim is to help its customers compete and win in their markets without being held back by hardware, licensing and resource limitations.
Tags : network management, best practices, network optimization, network applications, application management, technology, telecommunications
     SolidFire_NetApp
By: Carbonite     Published Date: Jul 18, 2018
Diamond Foods’ Diamond of California® nuts are household staples for shoppers across the U.S. But constantly filling grocery store shelves with snacks requires intricate supply chain management that relies on critical business data, including complex spreadsheets and enterprise resource planning files, to keep production and deliveries on schedule. “If our critical servers go down or we lose important data on employee laptops, it has a direct impact on our bottom line,” says Kentrell Davis, Senior Client Support Services Analyst at Diamond Foods.
Tags : 
     Carbonite
By: IBM     Published Date: May 12, 2017
In today’s world, the data is flowing from all directions: social media, phones, weather, location and sensor equipped devices, and more. Competing in this digital age requires the ability to analyze all of this data, and use it to drive decisions that mitigate risk, increase customer satisfaction and grow revenue. Using a combination of proprietary software and open source technology can give your data scientists and statisticians the analytical power they need to find and act on insights quickly. IBM® SPSS® Statistics provides all of the data analysis tools you need, and integrates with thousands of R extensions for maximum power and flexibility. In this next Data Science Central Webinar event, we will show how SPSS Statistics can help you keep up with the influx of new data and make faster, better business decisions without coding.
Tags : ibm, spss, data analysis, statistics, risk mitigation
     IBM
By: IBM     Published Date: Jun 21, 2017
Today, it’s unlikely that a single database will meet all your needs. For a variety of reasons—including the need to support cloud-scale solutions and increasingly dynamic app ecosystems—startups and enterprises alike are embracing a wide variety of open source databases. These varied databases—including MongoDB, Redis and PostgreSQL—open doors to building sophisticated and scalable applications on battle-hardened, non-proprietary databases.
Tags : ibm, data base, cloud, scalability, app ecosystem
     IBM
By: IBM     Published Date: Apr 20, 2017
The growth of virtualization has fundamentally changed the data center and raised numerous questions about data security and privacy. In fact, security concerns are the largest barrier to cloud adoption. Read this e-Book and learn how to protect sensitive data and demonstrate compliance. Virtualization is the creation of a logical rather than an actual physical version of something, such as a storage device, hardware platform, operating system, database or network resource. The usual goal of virtualization is to centralize administrative tasks while improving resilience, scalability and performance and lowering costs. Virtualization is part of an overall trend in enterprise IT towards autonomic computing, a scenario in which the IT environment will be able to manage itself based on an activity or set of activities. This means organizations use or pay for computing resources only as they need them.
Tags : data protection, data security, data optimization, organization optimization, cloud management, virtualization, data center, cloud environment
     IBM
By: IBM     Published Date: May 02, 2017
The growth of virtualization has fundamentally changed the data center and raised numerous questions about data security and privacy. In fact, security concerns are the largest barrier to cloud adoption. Read this e-Book and learn how to protect sensitive data and demonstrate compliance. Virtualization is the creation of a logical rather than an actual physical version of something, such as a storage device, hardware platform, operating system, database or network resource. The usual goal of virtualization is to centralize administrative tasks while improving resilience, scalability and performance and lowering costs. Virtualization is part of an overall trend in enterprise IT towards autonomic computing, a scenario in which the IT environment will be able to manage itself based on an activity or set of activities. This means organizations use or pay for computing resources only as they need them.
Tags : data protection, data security, data optimization, organization optimization, cloud management, virtualization, data center, cloud environment
     IBM
By: IBM     Published Date: Aug 10, 2009
Whatever you need, IBM WebSphere Portal Version 6.1 software can help by making it easy to connect people, information and applications in just the right combination for your business.
Tags : ibm, websphere portal, websphere portal version 6.1, empowering people, information, applications, investments, seamless, data sources, sales force, healthcare, enterprise applications
     IBM
By: SumTotal Systems     Published Date: Oct 10, 2013
Workforce analytics has become an essential business tool for leading companies that view workforce performance as the key to improving company results, according to a new global survey of business leaders by Harvard Business Review Analytics Services. Workforce analytics is a set of integrated capabilities (technologies, metrics, data, and processes) to measure and improve workforce performance. The goal is simple: put the right people with the right skills in the right work, provide them with the necessary training and development opportunities, and engage and empower them to perform at their highest possible level.
Tags : sumtotal, harvard business review, workforce, workforce analytics, analytics, company performance, workforce management, workforce data, workforce analysis, human resources, hr technology
     SumTotal Systems
By: Schneider Electric     Published Date: Jun 04, 2014
When faced with the decision of upgrading an existing data center, building new, or leasing space in a retail colocation data center, there are both quantitative and qualitative differences to consider. The 10-year TCO may favor upgrading or building over outsourcing; however, this paper demonstrates that the economics may be overwhelmed by a business’ sensitivity to cash flow, cash cross-over point, deployment timeframe, data center life expectancy, regulatory requirements, and other strategic factors. This paper discusses how to assess these key factors to help make a sound decision.
Tags : schneider, electric, data, infrastructure, upgrade, retail, outsource, own, data management, data center
     Schneider Electric
By: NEC     Published Date: Aug 12, 2014
Server virtualization is revolutionizing the datacenter by making applications mobile, increasing application uptime, and allowing IT admins to allocate computing resources more efficiently. The technology has been deployed widely enough that the role of the computer server has evolved from directly hosting operating systems and applications to hosting fully virtualized environments. Servers that can support more virtual machines (VMs, complete application stacks) allow their users to gain a higher return on their IT investments. A private cloud can extend these virtualization benefits to all parts of the organization. In this white paper, you will learn how corporate IT uses these tools to meet the increasing demand for IT services.
Tags : tier 1, cloud, machines, datacenter, servers, virtualization, customer value, analytics, application owners, system integrators, big data, reliability, enterprise, availability, serviceability, processor, enterprise applications
     NEC
By: Citrix Systems     Published Date: Feb 18, 2012
This paper outlines the compelling benefits of consolidating networking services, and details why competing efforts pursued by F5 technology come up short for critical ADC consolidation projects.
Tags : virtualization, data center, datacenters, consolidate, network, network resources, citrix, netscaler, adc, application delivery controller, infrastructure, consolidation, networking, networking resources
     Citrix Systems
By: Juniper Networks     Published Date: Oct 25, 2017
The primary purpose of containerized applications is to improve the effectiveness of software teams, making it easier for people to work together while lowering the communications overhead. In large enterprises, applications such as ERP or CRM software suites often begin as simple projects, but as time passes, they quickly become clunky and inefficient, with a monolithic code base that slows progress for development teams.
Tags : data management, data system, business development, software integration, resource planning, enterprise management, data collection
     Juniper Networks
By: IBM     Published Date: Oct 06, 2014
Business Intelligence (BI) has become a mandatory part of every enterprise’s decision-making fabric. Unfortunately, in many cases this rise in popularity brought a significant and disturbing complexity. Many BI environments came to include a myriad of moving parts: data warehouses and data marts deployed on multiple platforms and technologies, each requiring significant effort to ensure performance and support for the various needs and skill sets of the business resources using the environment. These convoluted systems became hard to manage or enhance with new requirements. To remain viable and sustainable, they must be simplified. Fortunately, today we have the ability to build simpler BI technical environments that still support the necessary business requirements but without the ensuing management complexity. This paper covers what is needed to simplify BI environments and the technologies that support this simplification.
Tags : data warehouses, bi environments, bi technologies, faster deployments
     IBM
By: Liaison Technologies     Published Date: Feb 07, 2014
Big Data holds the promise of a virtually unlimited ability to gain insight – and foresight – to sharpen strategies and improve tactical execution but only if it can be harnessed and managed. Through this whitepaper, you can learn how.
Tags : liaison, data integration, data management, it resources, business continuity, systems and processes, autonomous acquisitions, information segregation, integration scope, internal resources, hybrid-type, m&a, data processes, it infrastructure, enterprise data, enterprise trends, productivity, hybrid acquisitions, integration phase, data center
     Liaison Technologies
By: Oracle     Published Date: Sep 05, 2014
CMOs face a major dilemma: While 75% of CEOs want marketing to become more ROI-focused and attribute revenue to efforts, they’re also being tasked to innovate and lead their companies into the digital age. Read how the Oracle Marketing Cloud provides marketing leaders with data-driven solutions to unify marketing resources and empower Modern Marketing teams to deliver personalized customer experiences across each channel.
Tags : roi, marketing, cmo, resources, modern, cloud, digital age, resources, innovate, solutions, data-driven, customer, dilemma, solution
     Oracle
By: Oracle     Published Date: Sep 30, 2014
CMOs face a major dilemma: While 75% of CEOs want marketing to become more ROI-focused and attribute revenue to efforts, they’re also being tasked to innovate and lead their companies into the digital age. Read how the Oracle Marketing Cloud provides marketing leaders with data-driven solutions to unify marketing resources and empower Modern Marketing teams to deliver personalized customer experiences across each channel.
Tags : engagement, advocacy, revenue, marketing cloud, oracle, networking
     Oracle
By: McAfee EMEA     Published Date: Nov 15, 2017
A software-defined data center (SDDC) is a holistic approach to building better data centers. In its most basic form, an SDDC is a combination of virtual compute resources and software-defined storage and networking systems, often complemented by overarching security functions. In other words, the SDDC not only consolidates and automates all the traditionally physical compute, storage and networking aspects, but also helps improve security.
Tags : 
     McAfee EMEA
By: SAS     Published Date: Jan 17, 2018
We have conditioned patients not only to expect opioids for pain relief, but to utilize more and more of them, and the addiction is both psychological and physical. To remedy the situation, many policies, practices and behaviors must change around how the health care system approaches pain. But we do not yet have the data and analytics we need to determine what specifically to do at the patient level or the policy level. Download this whitepaper to learn more about the resources available and how we can fix this issue.
Tags : 
     SAS
By: CDW     Published Date: Nov 12, 2012
When data center technologies such as servers or storage area networks are aligned with specific applications or organizational departments, the result is often inefficiency. Through data center convergence, IT resources can be managed more easily.
Tags : data center convergence, it infrastructure, converged data center, server consolidation, virtualization, data center
     CDW
By: HP     Published Date: Oct 08, 2015
Administrators, engineers and executives are now tasked with solving some of the world’s most complex challenges. These could revolve around advanced computations for science, business, education, pharmaceuticals and beyond. Here’s the challenge: many data centers are reaching peak levels of resource consumption, and there’s more work to be done. So how are engineers and scientists supposed to continue working around such high-demand applications? How can they continue to create ground-breaking research while still utilizing optimized infrastructure? How can a platform scale to the new needs and demands of these types of users and applications? This is where HP Apollo Systems help reinvent the modern data center and accelerate your business.
Tags : apollo systems, reinventing hpc and the supercomputer, reinventing modern data center
     HP
By: Teradata     Published Date: May 02, 2017
Seven West Media (SWM) is Australia’s leading multiple platform media company, with a market-leading presence in broadcast television, magazine and newspaper publishing, and online. To support its unprecedented broadcast of the Rio de Janeiro 2016 Olympic Games, SWM integrated the world’s first production implementation of the Teradata Database running on Amazon Web Services (AWS) into its existing data ecosystem. This enabled the delivery of critical insights during the event and helped directly drive audience engagement with SWM’s digital products, resulting in the creation of a powerful audience asset that now informs their broadcasting, marketing, and advertising plans. Find out how in this case study.
Tags : data management, data system, business development, software integration, resource planning, enterprise management, data collection
     Teradata
By: IBM Watson Health     Published Date: Nov 10, 2017
To address the volume, velocity, and variety of data necessary for population health management, healthcare organizations need a big data solution that can integrate with other technologies to optimize care management, care coordination, risk identification and stratification, and patient engagement. Read this whitepaper and discover how to build a data infrastructure using the right combination of data sources; a “data lake” framework with massively parallel computing that expedites the answering of queries and the generation of reports to support care teams; analytic tools that identify care gaps and rising risk; predictive modeling; and effective screening mechanisms that quickly find relevant data. In addition to learning about these crucial tools for making your organization’s data infrastructure robust, scalable, and flexible, get valuable information about big data developments such as natural language processing and geographical information systems. Such tools can provide insight.
Tags : population health management, big data, data, data analytics, big data solution, data infrastructure, analytic tools, predictive modeling
     IBM Watson Health
By: SAS     Published Date: Jun 06, 2018
Data integration (DI) may be an old technology, but it is far from extinct. Today, rather than being done on a batch basis with internal data, DI has evolved to a point where it needs to be implicit in everyday business operations. Big data – of many types, and from vast sources like the Internet of Things – joins with the rapid growth of emerging technologies to extend beyond the reach of traditional data management software. To stay relevant, data integration needs to work with both indigenous and exogenous sources while operating at different latencies, from real time to streaming. This paper examines how data integration has gotten to this point, how it’s continuing to evolve and how SAS can help organizations keep their approach to DI current.
Tags : 
     SAS
By: SAS     Published Date: Aug 28, 2018
Data integration (DI) may be an old technology, but it is far from extinct. Today, rather than being done on a batch basis with internal data, DI has evolved to a point where it needs to be implicit in everyday business operations. Big data – of many types, and from vast sources like the Internet of Things – joins with the rapid growth of emerging technologies to extend beyond the reach of traditional data management software. To stay relevant, data integration needs to work with both indigenous and exogenous sources while operating at different latencies, from real time to streaming. This paper examines how data integration has gotten to this point, how it’s continuing to evolve and how SAS can help organizations keep their approach to DI current.
Tags : 
     SAS
Get your white papers featured in the insideHPC White Paper Library. Contact: Kevin@insideHPC.com