big data

Results 1 - 25 of 1087
By: Seagate     Published Date: Jan 27, 2015
This paper is the first to explore a recent breakthrough: the High Performance Computing (HPC) industry’s first Intelligence Community Directive (ICD) 503 (DCID 6/3 PL4) certified compliant and secure scale-out parallel file system solution, the Seagate ClusterStor™ Secure Data Appliance. It is designed to address government and enterprise needs for collaborative and secure information sharing within a Multi-Level Security (MLS) framework at Big Data and HPC scale.
Tags : 
     Seagate
By: Intel     Published Date: Aug 06, 2014
Around the world and across all industries, high-performance computing is being used to solve today’s most important and demanding problems. More than ever, storage solutions that deliver high sustained throughput are vital for powering HPC and Big Data workloads.
Tags : intel, enterprise edition lustre software
     Intel
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software. The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage - High-performance, highly scalable storage solutions with Intel® Enterprise Edition for Lustre* software and Intel® Xeon processor E7 based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Tags : 
     Intel
By: IBM     Published Date: Jun 05, 2014
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : ibm
     IBM
By: IBM     Published Date: Sep 16, 2015
Six criteria for evaluating high-performance cloud service providers. Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or that need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to quickly adapt to changing business requirements while dynamically managing workloads and data.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
According to our global study of more than 800 cloud decision makers and users, organizations are becoming increasingly focused on the business value cloud provides. Cloud is integral to mobile, social and analytics initiatives, and to the big data management challenge that often comes with them, and it helps power the entire suite of game-changing technologies. Enterprises can aim higher when these deployments are riding on the cloud. Mobile, analytics and social implementations can be bigger, bolder and drive greater impact when backed by scalable infrastructure. In addition to scale, cloud can provide integration, gluing the individual technologies into more cohesive solutions. Learn how companies are gaining a competitive advantage with cloud computing.
Tags : 
     IBM
By: Bull     Published Date: Dec 04, 2014
Bull, an Atos company, is a leader in Big Data, HPC and cyber-security with a worldwide market presence. Bull has extensive experience in implementing and running petaflops-scale supercomputers. The exascale program is a new step forward in Bull’s strategy to deliver exascale supercomputers capable of addressing the new challenges of science, industry and society.
Tags : bull, exascale, big data, hpc, cyber security, supercomputers
     Bull
By: Adaptive Computing     Published Date: Feb 21, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Intel     Published Date: Sep 16, 2014
In this Guide, we take a look at what an HPC solution like Lustre can deliver for a broad community of business and commercial organizations struggling with the challenge of big data and demanding storage growth.
Tags : intel, lustre*, solution for business
     Intel
By: Dell and Intel®     Published Date: Nov 18, 2015
Unleash the extreme performance and scalability of the Lustre® parallel file system for high performance computing (HPC) workloads, including technical ‘big data’ applications common within today’s enterprises. The Dell Storage for HPC with Intel® Enterprise Edition (EE) for Lustre Solution allows end users who need the benefits of large-scale, high-bandwidth storage to tap the power and scalability of Lustre, with simplified installation, configuration, and management features that are backed by Dell and Intel®.
Tags : 
     Dell and Intel®
By: General Atomics     Published Date: Jan 13, 2015
The term “Big Data” has become virtually synonymous with “schema on read” (where data is applied to a plan or schema as it is ingested or pulled out of a stored location) and with unstructured data analysis and handling techniques like Hadoop. These “schema on read” techniques have been most famously exploited on relatively ephemeral, human-readable data like retail trends, Twitter sentiment, social network mining, log files, etc. But what if you have unstructured data that, on its own, is hugely valuable, enduring, and created at great expense? Data that may not immediately be human-readable or indexable by search? Exactly the kind of data most commonly created and analyzed in science and HPC. Research institutions are awash with such data from large-scale experiments and extreme-scale computing. (A brief schema-on-read sketch follows this entry.)
Tags : general atomics, big data, metadata, nirvana
     General Atomics
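The “schema on read” idea above, where structure is applied only when the data is read rather than when it is stored, can be made concrete with a short sketch. The abstract refers to Hadoop-style tooling; the example below uses Apache Spark’s Java API merely as one common way to demonstrate schema-on-read, and the file path logs/events.json and the status column are hypothetical placeholders, not anything taken from the paper.

    import org.apache.spark.sql.Dataset;
    import org.apache.spark.sql.Row;
    import org.apache.spark.sql.SparkSession;

    public class SchemaOnReadSketch {
        public static void main(String[] args) {
            // No table definition or schema is declared up front.
            SparkSession spark = SparkSession.builder()
                    .appName("schema-on-read-sketch")
                    .master("local[*]")
                    .getOrCreate();

            // Schema on read: Spark inspects the raw JSON records and infers
            // column names and types at the moment the data is read.
            Dataset<Row> events = spark.read().json("logs/events.json");
            events.printSchema();

            // The inferred columns can be queried immediately, without any
            // load-time modeling of the data.
            events.groupBy("status").count().show();

            spark.stop();
        }
    }

The same files could later be re-read with a different or stricter schema, which is the flexibility that schema-on-read offers over traditional schema-on-write databases.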
By: Group M_IBM Q1'18     Published Date: Dec 19, 2017
Effectively using and managing information has become critical to driving growth in areas such as pursuing new business opportunities, attracting and retaining customers, and streamlining operations. In the era of big data, you must accommodate a rapidly increasing volume, variety and velocity of data while extracting actionable business insight from that data, faster than ever before. These needs create a daunting array of workload challenges and place tremendous demands on your underlying IT infrastructure and database systems. This e-book presents six reasons why you should consider a database change, including opinions from industry analysts and real-world customer experiences. Read on to learn more.
Tags : database, streamlining, it infrastructure, database systems
     Group M_IBM Q1'18
By: IBM     Published Date: Oct 17, 2017
Every day, torrents of data inundate IT organizations and overwhelm the business managers who must sift through it all to glean insights that help them grow revenues and optimize profits. Yet, after investing hundreds of millions of dollars into new enterprise resource planning (ERP), customer relationship management (CRM), master data management (MDM), business intelligence (BI) data warehousing systems or big data environments, many companies are still plagued with disconnected, “dysfunctional” data—a massive, expensive sprawl of disparate silos and unconnected, redundant systems that fail to deliver the desired single view of the business. To meet the business imperative for enterprise integration and stay competitive, companies must manage the increasing variety, volume and velocity of new data pouring into their systems from an ever-expanding number of sources. They need to bring all their corporate data together and deliver it to end users as quickly as possible.
Tags : 
     IBM
By: IBM     Published Date: Nov 08, 2017
IBM DB2 with BLU Acceleration helps tackle the challenges presented by big data. It delivers analytics at the speed of thought, always-available transactions, future-proof versatility, disaster recovery and streamlined ease-of-use to unlock the value of data.
Tags : ibm, cloud, cloud computing, database, ibm db2
     IBM
By: Group M_IBM Q1'18     Published Date: Jan 23, 2018
IBM DB2 with BLU Acceleration helps tackle the challenges presented by big data. It delivers analytics at the speed of thought, always-available transactions, future-proof versatility, disaster recovery and streamlined ease-of-use to unlock the value of data.
Tags : db2. blu accelaration, big data, data analytics, disaster recovery
     Group M_IBM Q1'18
By: Amazon Web Services     Published Date: Feb 01, 2018
Moving Beyond Traditional Decision Support. Future-proofing a business has never been more challenging. Customer preferences turn on a dime, and their expectations for service and support continue to rise. At the same time, the data lifeblood that flows through a typical organization is more vast, diverse, and complex than ever before. More companies today are looking to expand beyond traditional means of decision support, and are exploring how AI can help them find and manage the “unknown unknowns” in our fast-paced business environment.
Tags : predictive, analytics, data lake, infrastructure, natural language processing, amazon
     Amazon Web Services
By: AstuteIT_ABM_EMEA     Published Date: Feb 02, 2018
The demand for databases is on the rise as organizations build next-generation business applications. NoSQL offers enterprise architecture (EA) pros new choices to store, process, and access new data formats, deliver extreme web-scale, and lower data management costs. Forrester’s 26-criteria evaluation of 15 big data NoSQL solutions will help EA pros understand the choices available and recommend the best for their organization. This report details our findings about how each vendor fulfills our criteria and where they stand in relation to each other.
Tags : nosql, market, industries, strategy, presence, vendor
     AstuteIT_ABM_EMEA
By: Gemini Data     Published Date: Jan 16, 2018
The increasing reliance on big data platforms for all functions of the organization has been transformative. As these environments mature and data volumes increase, organizations face infrastructure and management scalability challenges. Gemini Enterprise Manager simplifies deployment and management with a turnkey, NoOps appliance, providing simplicity, security, and speed to accelerate the time to value for any analysis use case. Manager allows you to control your Splunk deployment as a single, unified solution deployed on premises, in the cloud or both.
Tags : 
     Gemini Data
By: NetApp     Published Date: Dec 13, 2013
Despite the hype, Big Data has introduced critical challenges for modern organizations – and unprepared organizations risk getting buried beneath an avalanche of information. In this informative webcast, join industry and business intelligence (BI) expert Wayne Eckerson, as he tackles the challenges of Big Data. Uncover practical tips and tactics for driving value with your Big Data platform – watch now to learn more.
Tags : big data problems, how to get the most from your big data
     NetApp
By: MIT Academy of Engineering, Pune     Published Date: Mar 05, 2013
A Java application that will be able to retrieve, insert, and delete data from our database, which will be implemented in HBase. The basic idea is to provide a much faster, safer method to transmit and receive huge amounts of data. (A sketch of the corresponding HBase client calls follows this entry.)
Tags : 
     MIT Academy of Engineering, Pune
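The entry above describes a Java application that retrieves, inserts and deletes data in HBase. Below is a minimal sketch of those three operations using the standard HBase Java client; the table name demo_table, column family cf, qualifier payload and row key row1 are hypothetical placeholders, since the project’s actual schema is not given in the abstract.

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.hbase.HBaseConfiguration;
    import org.apache.hadoop.hbase.TableName;
    import org.apache.hadoop.hbase.client.Connection;
    import org.apache.hadoop.hbase.client.ConnectionFactory;
    import org.apache.hadoop.hbase.client.Delete;
    import org.apache.hadoop.hbase.client.Get;
    import org.apache.hadoop.hbase.client.Put;
    import org.apache.hadoop.hbase.client.Result;
    import org.apache.hadoop.hbase.client.Table;
    import org.apache.hadoop.hbase.util.Bytes;

    public class HBaseCrudSketch {
        public static void main(String[] args) throws Exception {
            // Connection settings (ZooKeeper quorum, etc.) come from hbase-site.xml.
            Configuration conf = HBaseConfiguration.create();
            try (Connection conn = ConnectionFactory.createConnection(conf);
                 Table table = conn.getTable(TableName.valueOf("demo_table"))) {

                // Insert: write one cell under column family "cf".
                Put put = new Put(Bytes.toBytes("row1"));
                put.addColumn(Bytes.toBytes("cf"), Bytes.toBytes("payload"), Bytes.toBytes("some data"));
                table.put(put);

                // Retrieve: read the same cell back.
                Result result = table.get(new Get(Bytes.toBytes("row1")));
                System.out.println(Bytes.toString(
                        result.getValue(Bytes.toBytes("cf"), Bytes.toBytes("payload"))));

                // Delete: remove the entire row.
                table.delete(new Delete(Bytes.toBytes("row1")));
            }
        }
    }

For the bulk transfers the abstract hints at, many Put objects would typically be batched into a single table.put(List<Put>) call; the single-row form above just keeps the sketch short.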
By: SAP     Published Date: Mar 09, 2017
There’s strong evidence organizations are challenged by the opportunities presented by external information sources such as social media, government trend data, and sensor data from the Internet of Things (IoT). No longer content to use internal databases alone, they see big data resources augmented with external information resources as what they need in order to bring about meaningful change. According to a September 2015 global survey of 251 respondents conducted by Harvard Business Review Analytic Services, 78 percent of organizations agree or strongly agree that within two years the use of externally generated big data will be “transformational.” But there’s work to be done, since only 21 percent of respondents strongly agree that external data has already had a transformational effect on their firms.
Tags : 
     SAP
By: SAP     Published Date: Mar 09, 2017
Learn how CIOs can set up a system infrastructure for their business to get the best out of Big Data. Explore what the SAP HANA platform can do, how it integrates with Hadoop and related technologies, and the opportunities it offers to simplify your system landscape and significantly reduce cost of ownership.
Tags : 
     SAP
By: HPE APAC     Published Date: Jun 16, 2017
The bar has been raised higher than ever, and the role of IT is evolving to meet it. As a result, IT must support applications and services that make it possible for the business to provide new, diverse customer experiences while generating expanding revenues via the emergent crown jewels of business: big data, cloud, and mobility. Read on to find out more.
Tags : 
     HPE APAC
By: Cisco EMEA Tier 3 ABM     Published Date: Nov 13, 2017
Big data and analytics is a rapidly expanding field of information technology. Big data incorporates technologies and practices designed to support the collection, storage, and management of a wide variety of data types that are produced at ever increasing rates. Analytics combine statistics, machine learning, and data preprocessing in order to extract valuable information and insights from big data.
Tags : big data, analytics, virtualization, cloudera, ibm, sas, sap, splunk
     Cisco EMEA Tier 3 ABM