hadoop

Results 1 - 25 of 131
By: Bright Computing     Published Date: May 05, 2014
A successful HPC cluster is a powerful asset for an organization. The following essential strategies are guidelines for the effective operation of an HPC cluster resource:
1. Plan to manage the cost of software complexity
2. Plan for scalable growth
3. Plan to manage heterogeneous hardware/software solutions
4. Be ready for the cloud
5. Have an answer for the Hadoop question
Bright Cluster Manager addresses these strategies remarkably well and allows HPC and Hadoop clusters to be easily created, monitored, and maintained using a single comprehensive user interface. Administrators can focus on more sophisticated, value-adding tasks rather than developing homegrown solutions that may cause problems as clusters grow and change. The end result is an efficient and successful HPC cluster that maximizes user productivity.
Tags : bright computing, hpc clusters
     Bright Computing
By: IBM     Published Date: Jun 05, 2014
In an audited benchmark conducted by STAC®, the Securities Technology Analysis Center, InfoSphere BigInsights for Hadoop was found to deliver approximately a 4x average performance gain over open-source Hadoop when running jobs derived from production workload traces. The result is consistent with an approximately 11x advantage in raw scheduling performance provided by Adaptive MapReduce, a new InfoSphere BigInsights for Hadoop feature that leverages high-performance computing technology from IBM Platform Computing.
Tags : ibm
     IBM
By: IBM     Published Date: Jun 05, 2014
IBM Platform Symphony is a high performance SOA grid server that optimizes application performance and resource sharing. Platform Symphony runs distributed application services on a scalable, shared, heterogeneous grid and accelerates a wide variety of parallel applications, quickly computing results while making optimal use of available infrastructure. Platform Symphony Developer Edition enables developers to rapidly develop and test applications without the need for a production grid. After applications are running in the Developer Edition, they are guaranteed to run at scale once published to a scaled-out Platform Symphony grid. Platform Symphony Developer Edition also enables developers to easily test and verify Hadoop MapReduce applications against IBM Platform Symphony (a generic example of such a job follows this entry). By leveraging IBM Platform Symphony's proven, low-latency grid computing solution, more MapReduce jobs can run faster, frequently with less infrastructure.
Tags : ibm
     IBM
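For readers unfamiliar with what a "Hadoop MapReduce application" looks like in practice, the sketch below is the classic word-count job written against the standard Apache Hadoop org.apache.hadoop.mapreduce API, the kind of vanilla job the Developer Edition is intended to let you test before publishing to a full grid. It is a generic illustration assuming a standard Hadoop client classpath, not IBM sample code.

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.IntWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.Reducer;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class WordCount {
  // Mapper: emits (word, 1) for every token in its input split.
  public static class TokenizerMapper extends Mapper<Object, Text, Text, IntWritable> {
    private static final IntWritable ONE = new IntWritable(1);
    private final Text word = new Text();
    public void map(Object key, Text value, Context context) throws IOException, InterruptedException {
      StringTokenizer itr = new StringTokenizer(value.toString());
      while (itr.hasMoreTokens()) {
        word.set(itr.nextToken());
        context.write(word, ONE);
      }
    }
  }

  // Reducer (also used as combiner): sums the counts for each word.
  public static class IntSumReducer extends Reducer<Text, IntWritable, Text, IntWritable> {
    private final IntWritable result = new IntWritable();
    public void reduce(Text key, Iterable<IntWritable> values, Context context) throws IOException, InterruptedException {
      int sum = 0;
      for (IntWritable val : values) sum += val.get();
      result.set(sum);
      context.write(key, result);
    }
  }

  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();
    Job job = Job.getInstance(conf, "word count");
    job.setJarByClass(WordCount.class);
    job.setMapperClass(TokenizerMapper.class);
    job.setCombinerClass(IntSumReducer.class);
    job.setReducerClass(IntSumReducer.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(IntWritable.class);
    FileInputFormat.addInputPath(job, new Path(args[0]));   // input directory
    FileOutputFormat.setOutputPath(job, new Path(args[1])); // output directory (must not exist)
    System.exit(job.waitForCompletion(true) ? 0 : 1);
  }
}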
By: IBM     Published Date: Sep 16, 2015
Are you trying to support more variable workloads than your environment can handle? Can you benefit from a high performance cluster, but do not have the budget or resources to deploy and manage technical computing infrastructure? Are you running out of data center space but still need to grow your compute capacity? If you want answers to any of these questions, then please join us for an informative webinar describing the advantages and pitfalls of relocating a high performance workload to the cloud. View this webinar to learn:
- Why general purpose clouds are insufficient for technical computing analytics and Hadoop workloads
- How high performance clouds can improve your profitability and give you a competitive edge
- How to ensure that your cloud environment is secure
- How to evaluate which applications are suitable for a hybrid or public cloud environment
- How to get started and choose a service provider
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
IBM® has created a proprietary implementation of the open-source Hadoop MapReduce run-time that leverages the IBM Platform™ Symphony distributed computing middleware while maintaining application-level compatibility with Apache Hadoop.
Tags : 
     IBM
By: General Atomics     Published Date: Jan 13, 2015
The term “Big Data” has become virtually synonymous with “schema on read”, where data is mapped to a schema as it is read from storage rather than when it is written (a minimal illustrative sketch follows this entry), and with unstructured data analysis and handling techniques like Hadoop. These “schema on read” techniques have been most famously exploited on relatively ephemeral, human-readable data like retail trends, Twitter sentiment, social network mining, log files, and so on. But what if you have unstructured data that, on its own, is hugely valuable, enduring, and created at great expense? Data that may not immediately be human readable or indexable by search? Exactly the kind of data most commonly created and analyzed in science and HPC. Research institutions are awash with such data from large-scale experiments and extreme-scale computing that is used for high-consequence applications.
Tags : general atomics, big data, metadata, nirvana
     General Atomics
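To make the "schema on read" idea above concrete, here is a minimal, self-contained Java sketch (Java 16+ for records; the file name, field names and values are invented for illustration). The stored data is just raw text; the schema lives only in the code that reads it, so it can evolve without rewriting what is already on disk.

import java.io.IOException;
import java.nio.file.Files;
import java.nio.file.Path;
import java.util.List;

public class SchemaOnReadDemo {
  // The "schema" exists only in the reading code, not in the stored file.
  record SensorReading(String instrument, long timestampMillis, double value) {
    static SensorReading parse(String rawLine) {
      String[] f = rawLine.split(",");
      return new SensorReading(f[0], Long.parseLong(f[1]), Double.parseDouble(f[2]));
    }
  }

  public static void main(String[] args) throws IOException {
    // Data lands as raw, untyped lines; no table layout is imposed at write time.
    Path raw = Files.createTempFile("readings", ".csv");
    Files.write(raw, List.of("beamline-7,1736965000123,0.482",
                             "beamline-7,1736965000623,0.515"));

    // The schema is applied only at read time.
    double max = Files.readAllLines(raw).stream()
        .map(SensorReading::parse)
        .mapToDouble(SensorReading::value)
        .max()
        .orElse(Double.NaN);
    System.out.println("max value = " + max);
  }
}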
By: IBM     Published Date: Apr 18, 2017
Apache Hadoop technology is transforming the economics and dynamics of big data initiatives by supporting new processes and architectures that can help cut costs, increase revenue and create competitive advantage. An effective big data integration solution delivers simplicity, speed, scalability, functionality and governance to produce consumable data. To cut through the hype and misinformation and develop an adoption plan for your Hadoop big data project, you must follow a best practices approach that takes into account emerging technologies, scalability requirements, and current resources and skill levels.
Tags : data integration, data security, data optimization, data virtualization, database security, data migration, data assets, data delivery
     IBM
By: NetApp     Published Date: Dec 13, 2013
FlexPod Select with Hadoop delivers enterprise-class Hadoop with validated, pre-configured components for fast deployment, higher reliability and smoother integration with existing applications and infrastructure. These technical reference architectures optimize storage, networking, and servers with Cloudera and Hortonworks distributions of Hadoop. Leverage FlexPod Select with Hadoop to help store, manage, process and perform advanced analytics on your multi-structured data. Tuning parameters, optimization techniques and other Hadoop cluster guidelines are provided (a generic example of such tuning parameters follows this entry).
Tags : flexpod with hadoop, enterprise data, storage infrastructure, massive amounts of data
     NetApp
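The abstract mentions tuning parameters and Hadoop cluster guidelines. Purely as an illustration of the kind of knobs such guidelines cover, the sketch below sets a few standard Apache Hadoop 2.x properties programmatically. The property names are real Hadoop settings, but the values are placeholders, not the FlexPod-validated recommendations, which come from the reference architecture itself.

import org.apache.hadoop.conf.Configuration;

public class TuningSketch {
  public static void main(String[] args) {
    // Start from cluster defaults (core-site.xml, hdfs-site.xml, mapred-site.xml on the classpath).
    Configuration conf = new Configuration();

    // Commonly tuned knobs; values below are illustrative placeholders only.
    conf.setLong("dfs.blocksize", 256L * 1024 * 1024);   // HDFS block size
    conf.setInt("dfs.replication", 3);                    // replica count
    conf.setInt("mapreduce.map.memory.mb", 2048);         // container memory per map task
    conf.setInt("mapreduce.reduce.memory.mb", 4096);      // container memory per reduce task
    conf.setInt("mapreduce.task.io.sort.mb", 512);        // map-side sort buffer

    for (String key : new String[] {"dfs.blocksize", "dfs.replication",
        "mapreduce.map.memory.mb", "mapreduce.reduce.memory.mb", "mapreduce.task.io.sort.mb"}) {
      System.out.println(key + " = " + conf.get(key));
    }
  }
}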
By: NetApp     Published Date: Dec 13, 2013
Interested in running a Hadoop proof of concept on enterprise-class storage? Download this solutions guide to get a technical overview on building Hadoop on NetApp E-Series storage. NetApp Open Solution for Hadoop delivers big analytics with preengineered, compatible, and supported solutions based on high-quality storage platforms, so you can reduce the cost, schedule, and risk of do-it-yourself systems while relieving the Hadoop skills gap most organizations face. See how ongoing operational and maintenance costs can be reduced with a highly available and scalable Hadoop solution.
Tags : open solutions, hadoop solutions guide
     NetApp
By: NetApp     Published Date: Dec 13, 2013
Learn why NetApp Open Solution for Hadoop is better than clusters built on commodity storage. This ESG lab report details the reasons why NetApp's use of direct attached storage for Hadoop improves performance, scalability and availability compared to typical internal hard drive Hadoop deployments.
Tags : netapp open solution for hadoop, direct attached storage
     NetApp
By: MIT Academy of Engineering, Pune     Published Date: Mar 05, 2013
A Java application that can retrieve, insert, and delete data from our database, which will be implemented in HBase. The basic idea is to provide a much faster, safer method to transmit and receive huge amounts of data (a minimal HBase client sketch follows this entry).
Tags : 
     MIT Academy of Engineering, Pune
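As a rough illustration of the retrieve, insert and delete operations described above, here is a minimal sketch against the standard Apache HBase client API. The table name, column family and row key are invented placeholders; the paper's actual schema and transport mechanism are not described in this listing.

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hbase.HBaseConfiguration;
import org.apache.hadoop.hbase.TableName;
import org.apache.hadoop.hbase.client.Connection;
import org.apache.hadoop.hbase.client.ConnectionFactory;
import org.apache.hadoop.hbase.client.Delete;
import org.apache.hadoop.hbase.client.Get;
import org.apache.hadoop.hbase.client.Put;
import org.apache.hadoop.hbase.client.Result;
import org.apache.hadoop.hbase.client.Table;
import org.apache.hadoop.hbase.util.Bytes;

public class HBaseCrudSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = HBaseConfiguration.create();   // reads hbase-site.xml from the classpath
    try (Connection connection = ConnectionFactory.createConnection(conf);
         Table table = connection.getTable(TableName.valueOf("transfers"))) {

      // Insert: one row with a single column in family "d".
      Put put = new Put(Bytes.toBytes("row-001"));
      put.addColumn(Bytes.toBytes("d"), Bytes.toBytes("payload"), Bytes.toBytes("example data"));
      table.put(put);

      // Retrieve: read the value back.
      Result result = table.get(new Get(Bytes.toBytes("row-001")));
      byte[] value = result.getValue(Bytes.toBytes("d"), Bytes.toBytes("payload"));
      System.out.println("read back: " + Bytes.toString(value));

      // Delete: remove the row.
      table.delete(new Delete(Bytes.toBytes("row-001")));
    }
  }
}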
By: SAP     Published Date: Mar 09, 2017
Learn how CIOs can set up a system infrastructure for their business to get the best out of Big Data. Explore what the SAP HANA platform can do, how it integrates with Hadoop and related technologies, and the opportunities it offers to simplify your system landscape and significantly reduce cost of ownership.
Tags : 
     SAP
By: Pentaho     Published Date: Feb 26, 2015
This TDWI Best Practices report explains the benefits that Hadoop and Hadoop-based products can bring to organizations today, both for big data analytics and as complements to existing BI and data warehousing technologies.
Tags : big data, big data analytics, data warehousing technologies, data storage, business intelligence, data integration, enterprise applications, data management
     Pentaho
By: Pentaho     Published Date: Nov 04, 2015
This report, based on TDWI research plus survey responses from 325 data management professionals across 13 industries, explains the benefits that Hadoop and Hadoop-based products can bring to organizations today, both for big data analytics and as complements to existing BI and data warehousing technologies. It also covers Hadoop best practices and provides an overview of tools and platforms that integrate with Hadoop.
Tags : pentaho, analytics, platforms, hadoop, big data, predictive analytics, data management, networking
     Pentaho
By: Pentaho     Published Date: Nov 04, 2015
Although the phrase “next-generation platforms and analytics” can evoke images of machine learning, big data, Hadoop, and the Internet of things, most organizations are somewhere in between the technology vision and today’s reality of BI and dashboards. Next-generation platforms and analytics often mean simply pushing past reports and dashboards to more advanced forms of analytics, such as predictive analytics. Next-generation analytics might move your organization from visualization to big data visualization; from slicing and dicing data to predictive analytics; or to using more than just structured data for analysis.
Tags : pentaho, analytics, platforms, hadoop, big data, predictive analytics, networking, it management
     Pentaho
By: Dell EMC     Published Date: Nov 09, 2015
This business-oriented white paper summarizes the wide-ranging benefits of the Hadoop platform, highlights common data processing use cases and explores examples of specific use cases in vertical industries. The information presented here draws on the collective experiences of three leaders in the use of Hadoop technologies—Dell and its partners Cloudera and Intel.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Nov 09, 2015
While the enterprise data warehouse (EDW) plays an all-important role in the effort to leverage big data to drive business value, it is not without its challenges. In particular, the typical EDW is being pushed to its limits by the volume, velocity and variety of data. Download this whitepaper and see how the Dell™ | Cloudera™ | Syncsort™ Data Warehouse Optimization – ETL Offload Reference Architecture can help.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Oct 08, 2015
The information presented here draws on the collective experiences of three leaders in the use of Hadoop technologies—Dell® and its partners Cloudera® and Intel®.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Oct 08, 2015
This paper discusses how the many Dell | Cloudera Hadoop solutions help organizations of all sizes, and with a variety of needs and use cases, tackle their big data requirements.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Oct 08, 2015
In order to protect big data today, organizations must have solutions that address four key areas: authentication, authorization, audit and lineage, and compliant data protection.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Oct 08, 2015
As this guide shows, to compete in the new multi-channel environment retailers have to adopt new and innovative strategies to attract and retain customers. Big data technologies, specifically Hadoop, enable retailers to connect with customers through multiple channels at an entirely new level by harnessing the vast volumes of new data available today. Hadoop helps retailers store, transform, integrate and analyze a wide variety of online and offline customer data—POS transactions, e-commerce transactions, clickstream data, email, social media, sensor data and call center records—all in one central repository (a brief ingestion sketch follows this entry).
Tags : 
     Dell EMC
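As a small illustration of the "central repository" idea, the sketch below writes one record into a shared HDFS directory layout using the standard Hadoop FileSystem API. The path and fields are hypothetical; in practice feeds such as POS transactions or call center records would usually arrive through dedicated ingestion tooling rather than hand-written client code.

import java.nio.charset.StandardCharsets;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FSDataOutputStream;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class IngestSketch {
  public static void main(String[] args) throws Exception {
    Configuration conf = new Configuration();        // picks up core-site.xml / fs.defaultFS
    FileSystem fs = FileSystem.get(conf);

    // Hypothetical layout: one directory per source feeding the shared repository.
    Path target = new Path("/retail/clickstream/2015/10/08/events.csv");

    try (FSDataOutputStream out = fs.create(target, true)) {   // true = overwrite if present
      out.write("sessionId,timestamp,page\nabc123,2015-10-08T12:00:00Z,/checkout\n"
          .getBytes(StandardCharsets.UTF_8));
    }
    System.out.println("wrote " + fs.getFileStatus(target).getLen() + " bytes to " + target);
  }
}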
By: Dell EMC     Published Date: Oct 08, 2015
Download this white paper to learn how the company deployed a Dell Hadoop cluster based on Dell and Intel® technologies to support a new big data insight solution that gives clients a unified view of customer data.
Tags : 
     Dell EMC
By: Dell EMC     Published Date: Oct 12, 2015
This business-oriented white paper explains four options for starting your Hadoop journey. This paper also outlines the benefits of Hadoop and highlights some of the many use cases for this new approach to managing, storing and processing big data.
Tags : 
     Dell EMC

Get your white papers featured in the insideHPC White Paper Library. Contact: Kevin@insideHPC.com