data source

Results 226 - 250 of 741
By: Dassault Systèmes     Published Date: Jul 21, 2017
Product Lifecycle Management (PLM) software can help your company keep up with the increasing complexity of developing today’s high-tech products. While smaller companies may use relatively simple Product Data Management (PDM) tools, larger companies rely on full-featured PLM systems that help automate processes and share data across global supply chains. Mid-size companies can feel stuck because PDM is too basic, but PLM feels out of reach. This resource will help you:
• Recognize why “simple” solutions fall short and do not support your capabilities
• Better connect to customers and the supply chain
• Drive higher product development speed
• Get started with the right PLM solution
Midsize manufacturers need a system that quickly delivers the core capabilities they need to streamline product development but also gives them room to grow value over time. So, what’s the right size PLM to fit a midsized high-tech company? Download this resource and take a look.
Tags : product solutions, lifecycle management, tech products, data management tools, pdm, plm, process automation, product development speed, manufacturers
     Dassault Systèmes
By: Here Technologies     Published Date: Mar 29, 2019
Rich, real-time location intelligence enables third-party logistics (3PL) companies to deliver pinpoint accuracy and offer superior service, which ultimately means higher volumes, better timelines and more competitive budgets. This ebook looks at seven specific ways in which location data is being used by 3PL companies to enable more accurate fleet routing, precise tracking of vehicles and the ability to meet increasingly demanding customer expectations. As the world’s leading location platform in 2018 (Source: Ovum and Counterpoint Research annual indexes), HERE can help 3PL companies develop key competitive advantages.
Tags : here technologies, transport and logistics
     Here Technologies
By: Here Technologies     Published Date: Mar 29, 2019
By making use of the right technology, transportation and logistics (T&L) companies can reinvent their industry with superior customer service, increased revenues, and reduced costs. This whitepaper looks at the obstacles facing the T&L industry and examines how fleet planning and trip analysis software can help businesses by enabling faster and smarter data-driven decision-making. As the world’s leading location platform in 2018 (Source: Ovum and Counterpoint Research annual indexes), HERE can help T&L companies improve fleet management.
Tags : here technologies, transport and logistics
     Here Technologies
By: Teradata     Published Date: May 02, 2017
Read this article to discover the 4 things no data warehouse should be without.
Tags : cloud data, cloud security, cloud management, storage resource, computing resources, data warehousing, data storage, cloud efficiency
     Teradata
By: Teradata     Published Date: May 02, 2017
Seven West Media (SWM) is Australia’s leading multiple-platform media company, with a market-leading presence in broadcast television, magazine and newspaper publishing, and online. To support its unprecedented broadcast of the Rio de Janeiro 2016 Olympic Games, SWM integrated the world’s first production implementation of the Teradata Database running on Amazon Web Services (AWS) into its existing data ecosystem. This enabled the delivery of critical insights during the event and helped directly drive audience engagement with SWM’s digital products, resulting in the creation of a powerful audience asset that now informs their broadcasting, marketing, and advertising plans. Find out how in this case study.
Tags : data management, data system, business development, software integration, resource planning, enterprise management, data collection
     Teradata
By: Pure Storage     Published Date: Oct 09, 2018
Apache® Spark™ has become a vital technology for development teams looking to leverage an ultrafast in-memory data engine for big data analytics. Spark is a flexible open-source platform, letting developers write applications in Java, Scala, Python or R. With Spark, development teams can accelerate analytics applications by orders of magnitude.
Tags : 
     Pure Storage
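The Spark entry above mentions writing analytics applications in Java, Scala, Python or R against an in-memory engine. As a minimal illustration (not taken from the Pure Storage paper), the PySpark sketch below loads a hypothetical CSV file, caches it in memory and runs a simple aggregation; the file name and column names are assumptions.

# Minimal PySpark sketch: load a dataset, cache it in memory, and run a
# simple aggregation. "events.csv" and its columns are hypothetical.
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("analytics-sketch").getOrCreate()

# Read a CSV source; schema inference keeps the example short.
events = spark.read.csv("events.csv", header=True, inferSchema=True)

# cache() keeps the DataFrame in memory so repeated queries avoid re-reading disk.
events.cache()

# Example aggregation: event counts and average latency per region.
summary = (events.groupBy("region")
           .agg(F.count("*").alias("events"),
                F.avg("latency_ms").alias("avg_latency_ms")))

summary.show()
spark.stop()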
By: Nice Systems     Published Date: Feb 26, 2019
NICE WFM 7.0’s Forecaster unlocks a high level of transparency into interaction history, allowing you to centrally forecast, schedule and manage contacts across multiple locations and ensure that site- and enterprise-level objectives are met. With more than two thousand customers and two million users depending on its unparalleled ability to fine-tune the most precise forecasts, Forecaster allows you to plan for and respond to the peaks and valleys of customer history through automatic collection of key historical data from all types of contact sources:
• Automatic call distributors (ACDs)
• Outbound dialers
• Multi-channel routing platforms
• Back-office employee desktops
Download today to learn more.
Tags : 
     Nice Systems
By: NAVEX Global     Published Date: Mar 11, 2014
This report reviews all-industry benchmarks created using data from all 4,600 participating companies in the NAVEX Global database. It should serve as an excellent starting point for companies wishing to assess their organization’s reporting data, helping equip them to make informed decisions about programme effectiveness, potential problem areas and necessary resource allocations.
Tags : ethics, compliance, benchmarks, navex
     NAVEX Global
By: NetApp     Published Date: Feb 19, 2015
The NetApp EF550 all-flash array provides a robust platform for delivering exceptional performance to mission-critical applications. The EF550 flash array leverages the latest in solid-state disk technologies along with a strong heritage of handling diverse workloads and providing superior business value through the acceleration of latency-sensitive and high-I/O applications. This technical report provides an overview of workload characterizations performed on NetApp EF-series all-flash arrays across a wide variety of I/O types, with best practices and observations based on extensive test data. Characterizations include IOPS, throughput, and latency under varying block loads and RAID levels, SSD rebuild times, EF540 and EF550 performance comparison, and usable capacity sizing guidelines for various protection levels. If you are considering deploying an EF-series flash storage array in your environment, this is an essential technical resource to ensure proper configuration and maximum performance.
Tags : 
     NetApp
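The EF-series report above characterizes IOPS, throughput and latency under varying block loads. As a rough aid to reading such characterizations (not data from the report itself), the small Python sketch below shows the standard back-of-the-envelope relationship between IOPS, block size and throughput; all figures are illustrative placeholders.

# Back-of-the-envelope relationship between IOPS, block size and throughput,
# useful when reading array characterization reports like the one above.
# The figures below are illustrative placeholders, not EF550 results.

def throughput_mib_s(iops: float, block_size_kib: float) -> float:
    """Approximate throughput implied by an IOPS figure at a given block size."""
    return iops * block_size_kib / 1024  # KiB/s -> MiB/s

# Example: 100,000 IOPS at 4 KiB blocks implies ~391 MiB/s,
# while the same IOPS at 64 KiB blocks would imply ~6,250 MiB/s.
for bs in (4, 8, 64):
    print(f"{bs:>3} KiB blocks: {throughput_mib_s(100_000, bs):,.0f} MiB/s")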
By: IBM     Published Date: Feb 22, 2016
To meet the business imperative for enterprise integration and stay competitive, companies must manage the increasing variety, volume and velocity of new data pouring into their systems from an ever-expanding number of sources.
Tags : ibm, data, performance, scalability, information integration, big data, enterprise applications, data management
     IBM
By: IBM     Published Date: Jul 05, 2016
In an environment where data is the most critical natural resource, speed-of-thought insights from information and analytics are a critical competitive imperative.
Tags : ibm, data warehouse, big data, analytics, data warehouse, business intelligence, data management, data center
     IBM
By: IBM     Published Date: Jul 06, 2016
Today data volumes are exploding in every facet of our lives. Business leaders are eager to harness the power of big data, but before setting out into the big data world, it is important to understand that as opportunities increase, ensuring that source information is trustworthy and protected becomes exponentially more difficult. This paper provides a detailed review of the best practices clients should consider before embarking on their big data integration projects.
Tags : ibm, big data, trusted data, data management, data solutions, data center
     IBM
By: IBM     Published Date: Jul 14, 2016
How do you keep 130,000 guests safely entertained, fed, watered and informed in a sustainable way? Roskilde Festival knew that the critical insights lay hidden in huge volumes of real-time data. The Copenhagen Business School used IBM technologies to build a cloud big data lab that correlates information from multiple sources, delivering valuable insight for planning and running the festival. Download to learn more.
Tags : ibm, datamart on demand, data, analytics, big data, real-time data, cloud, cloud data analytics, enterprise applications, data center
     IBM
By: IBM     Published Date: Oct 13, 2016
Cloud-based data and processing services present too much opportunity for organizations to ignore. How can organizations realize the obvious financial benefits of the cloud while ensuring information culled from cloud sources is secure and trustworthy? Good hybrid information governance implies several priorities for IT and the business which are based on these foundational pillars:
- Broad agreement on what information means
- Clear agreement on how owned information assets will be maintained and monitored
- Standard practices for securing strategic information assets
- Enterprise data integration strategy
Tags : ibm, trusted data, big data, governance, data governance, data management, data center
     IBM
By: IBM     Published Date: Oct 13, 2016
Who's afraid of the big (data) bad wolf? Survive the big data storm by getting ahead of integration and governance functional requirements Today data volumes are exploding in every facet of our lives. Business leaders are eager to harness the power of big data but before setting out into the big data world it is important to understand that as opportunities increase ensuring that source information is trustworthy and protected becomes exponentially more difficult. This paper provides a detailed review of the best practices clients should consider before embarking on their big data integration projects.
Tags : ibm, big data, trusted data, data management, data solutions, data center
     IBM
By: IBM     Published Date: Oct 13, 2016
IBM InfoSphere Information Server connects to many new ‘at rest’ and streaming big data sources, scales natively on Hadoop using partition and pipeline parallelism, automates data profiling, and provides a business glossary and an information catalog, while also supporting IT.
Tags : ibm, data, analytics, big data, data integration, data management, data center
     IBM
By: IBM     Published Date: Jan 27, 2017
Every day, torrents of data inundate IT organizations and overwhelm the business managers who must sift through it all to glean insights that help them grow revenues and optimize profits. Yet, after investing hundreds of millions of dollars into new enterprise resource planning (ERP), customer relationship management (CRM), master data management systems (MDM), business intelligence (BI) data warehousing systems or big data environments, many companies are still plagued with disconnected, “dysfunctional” data—a massive, expensive sprawl of disparate silos and unconnected, redundant systems that fail to deliver the desired single view of the business.
Tags : 
     IBM
By: IBM     Published Date: Jan 27, 2017
Today, all consumers can obtain any piece of data at any point in time. This experience represents a significant cultural shift: the beginning of the democratization of data. However, the data landscape is increasing in complexity, with diverse data types from myriad sources residing in a mix of environments: on-premises, in the cloud or both. How can you avoid data chaos?
Tags : 
     IBM
By: IBM     Published Date: Jan 27, 2017
As with most innovations in business information technology, the ultimate truth about cloud lies somewhere in between. There is little doubt that cloud-based infrastructures offer an immediate opportunity for smaller organizations to avoid the costly investment needed for a robust on-premises computing environment. Data can be found, processed and managed on the cloud without investing in any local hardware. Large organizations with mature on-premises computing infrastructures are looking to Hadoop platforms to help them benefit from the vast array of structured and unstructured data from cloud-based sources. Organizations have feet in both cloud and on-premises worlds. In fact, one could easily argue that we already live in a “hybrid” world.
Tags : 
     IBM
By: IBM     Published Date: Jan 27, 2017
In today’s highly distributed, multi-platform world, the data needed to solve any particular decision making need is increasingly likely to be found across a wide variety of sources. As a result, traditional manual approaches requiring prior collection, storage and integration of extensive sets of data in the analyst’s preferred exploration environment are becoming less useful. Data virtualization, which offers transparent access to distributed, diverse data sources, offers a valuable alternative approach in these circumstances.
Tags : 
     IBM
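To make the data-virtualization idea above concrete, here is a deliberately simplified Python sketch of a single access function that hides whether a source is a flat file or a relational table. It is a conceptual illustration only, not IBM’s technology; the source names, file, database and columns are all hypothetical.

# Conceptual sketch of data virtualization: one query function that hides
# whether data lives in a file or a database. Sources and columns are hypothetical.
import sqlite3
import pandas as pd

def read_source(name: str) -> pd.DataFrame:
    """Return a DataFrame regardless of where the named source resides."""
    if name == "orders":                       # flat-file source
        return pd.read_csv("orders.csv")
    if name == "customers":                    # relational source
        with sqlite3.connect("crm.db") as conn:
            return pd.read_sql("SELECT * FROM customers", conn)
    raise KeyError(f"unknown source: {name}")

# The analyst joins the two sources without knowing (or caring) where they live.
combined = read_source("orders").merge(read_source("customers"), on="customer_id")
print(combined.head())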
By: IBM     Published Date: Apr 14, 2017
Any organization wishing to process big data from newly identified data sources needs to first determine the characteristics of the data and then define the requirements that must be met to ingest, profile, clean, transform and integrate this data to ready it for analysis. Having done that, it may well be the case that existing tools do not cater for the data variety, data volume and data velocity that these new data sources bring. If this occurs, then clearly new technology will need to be considered to meet the needs of the business going forward.
Tags : data integration, big data, data sources, business needs, technological advancements, scaling data
     IBM
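The abstract above describes determining the characteristics of new data before ingesting, profiling, cleaning, transforming and integrating it. A minimal pandas sketch of those profile-clean-transform steps follows; the input file, column names and cleaning rules are assumptions for illustration.

# Minimal sketch of the profile -> clean -> transform steps named above,
# using pandas. "readings.csv" and the cleaning rules are hypothetical.
import pandas as pd

df = pd.read_csv("readings.csv")

# Profile: understand volume, types, nulls and basic distributions first.
print(df.shape)
print(df.dtypes)
print(df.isna().mean())              # share of missing values per column
print(df.describe(include="all"))

# Clean: drop exact duplicates and rows missing the key field.
df = df.drop_duplicates().dropna(subset=["sensor_id"])

# Transform: normalize types and derive a field used downstream.
df["timestamp"] = pd.to_datetime(df["timestamp"], errors="coerce")
df["reading_date"] = df["timestamp"].dt.date

df.to_csv("readings_clean.csv", index=False)   # ready for integration/analysis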
By: IBM     Published Date: Apr 18, 2017
Apache Hadoop technology is transforming the economics and dynamics of big data initiatives by supporting new processes and architectures that can help cut costs, increase revenue and create competitive advantage. An effective big data integration solution delivers simplicity, speed, scalability, functionality and governance to produce consumable data. To cut through the surrounding misinformation and develop an adoption plan for your Hadoop big data project, you must follow a best-practices approach that takes into account emerging technologies, scalability requirements, and current resources and skill levels.
Tags : data integration, data security, data optimization, data virtualization, database security, data migration, data assets, data delivery
     IBM
By: IBM     Published Date: May 09, 2017
How companies are managing growth, gaining insights and cutting costs in the era of big data
Tags : data management, data system, business development, software integration, resource planning, enterprise management, data collection
     IBM
By: Group M_IBM Q2'19     Published Date: Apr 03, 2019
Data is the lifeblood of business. And in the era of digital business, the organizations that utilize data most effectively are also the most successful. Whether structured, unstructured or semi-structured, rapidly increasing data quantities must be brought into organizations, stored and put to work to enable business strategies. Data integration tools play a critical role in extracting data from a variety of sources and making it available for enterprise applications, business intelligence (BI), machine learning (ML) and other purposes. Many organizations seek to enhance the value of data for line-of-business managers by enabling self-service access. This is increasingly important as large volumes of unstructured data from Internet-of-Things (IoT) devices present organizations with opportunities for game-changing insights from big data analytics. A new survey of 369 IT professionals, from managers to directors and VPs of IT, conducted by BizTechInsights on behalf of IBM, reveals the challenges organizations face.
Tags : 
     Group M_IBM Q2'19
By: Group M_IBM Q2'19     Published Date: Apr 03, 2019
As the information age matures, data has become the most powerful resource enterprises have at their disposal. Businesses have embraced digital transformation, often staking their reputations on insights extracted from collected data. While decision-makers home in on hot topics like AI and the potential of data to drive businesses into the future, many underestimate the pitfalls of poor data governance. If business decision-makers can’t trust the data within their organization, how can stakeholders and customers know they are in good hands? Information that is not correctly distributed, or abandoned within an IT silo, can prove harmful to the integrity of business decisions.
Tags : 
     Group M_IBM Q2'19