big data

Results 1 - 25 of 1197
By: Seagate     Published Date: Jan 27, 2015
This paper is the first to explore a recent breakthrough: the High Performance Computing (HPC) industry's first Intelligence Community Directive (ICD) 503 (DCID 6/3 PL4) certified, compliant, and secure scale-out parallel file system solution, the Seagate ClusterStor™ Secure Data Appliance. It is designed to address the government and enterprise need for collaborative, secure information sharing within a Multi-Level Security (MLS) framework at Big Data and HPC scale.
Tags : 
     Seagate
By: Intel     Published Date: Aug 06, 2014
Around the world and across all industries, high-performance computing is being used to solve today’s most important and demanding problems. More than ever, storage solutions that deliver high sustained throughput are vital for powering HPC and Big Data workloads.
Tags : intel, enterprise edition lustre software
     Intel
By: Intel     Published Date: Aug 06, 2014
Powering Big Data Workloads with Intel® Enterprise Edition for Lustre* software
The Intel® portfolio for high-performance computing provides the following technology solutions:
• Compute - The Intel® Xeon processor E7 family provides a leap forward for every discipline that depends on HPC, with industry-leading performance and improved performance per watt. Add Intel® Xeon Phi coprocessors to your clusters and workstations to increase performance for highly parallel applications and code segments. Each coprocessor can add over a teraflop of performance and is compatible with software written for the Intel® Xeon processor E7 family. You don’t need to rewrite code or master new development tools.
• Storage - High performance, highly scalable storage solutions with Intel® Enterprise Edition for Lustre* software and Intel® Xeon processor E7 based storage systems for centralized storage. Reliable and responsive local storage with Intel® Solid State Drives.
• Networking - Intel® True Scale Fabric and Networking technologies, built for HPC to deliver fast message rates and low latency.
• Software and Tools - A broad range of software and tools to optimize and parallelize your software and clusters.
Tags : 
     Intel
By: IBM     Published Date: Jun 05, 2014
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : ibm
     IBM
By: IBM     Published Date: Sep 16, 2015
6 criteria for evaluating a high-performance cloud services provider
Engineering, scientific, analytics, big data and research workloads place extraordinary demands on technical and high-performance computing (HPC) infrastructure. Supporting these workloads can be especially challenging for organizations that have unpredictable spikes in resource demand, or that need access to additional compute or storage resources for a project or to support a growing business. Software Defined Infrastructure (SDI) enables organizations to deliver HPC services in the most efficient way possible, optimizing resource utilization to accelerate time to results and reduce costs. SDI is the foundation for a fully integrated environment, optimizing compute, storage and networking infrastructure to quickly adapt to changing business requirements, and dynamically managing workloads and data.
Tags : 
     IBM
By: IBM     Published Date: Sep 16, 2015
According to our global study of more than 800 cloud decision makers and users, organizations are becoming increasingly focused on the business value cloud provides. Cloud is integral to mobile, social and analytics initiatives, and to the big data management challenge that often comes with them, and it helps power the entire suite of game-changing technologies. Enterprises can aim higher when these deployments are riding on the cloud. Mobile, analytics and social implementations can be bigger, bolder and drive greater impact when backed by scalable infrastructure. In addition to scale, cloud can provide integration, gluing the individual technologies into more cohesive solutions. Learn how companies are gaining a competitive advantage with cloud computing.
Tags : 
     IBM
By: IBM     Published Date: May 20, 2015
Whether in high-performance computing, Big Data or analytics, information technology has become an essential tool in today’s hyper-competitive business landscape. Organizations are increasingly being challenged to do more with less and this is fundamentally impacting the way that IT infrastructure is deployed and managed. In this short e-book, learn the top ten ways that IBM Platform Computing customers are using technologies like IBM Platform LSF and IBM Platform Symphony to help obtain results faster, share resources more efficiently, and improve the overall cost-effectiveness of their global IT infrastructure.
Tags : 
     IBM
By: Bull     Published Date: Dec 04, 2014
Bull, an Atos company, is a leader in Big Data, HPC and cyber-security with a worldwide market presence. Bull has extensive experience in implementing and running petaflops-scale supercomputers. The exascale program is a new step forward in Bull’s strategy to deliver exascale supercomputers capable of addressing the new challenges of science, industry and society.
Tags : bull, exascale, big data, hpc, cyber security, supercomputers
     Bull
By: Adaptive Computing     Published Date: Feb 21, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Intel     Published Date: Sep 16, 2014
In this Guide, we take a look at what an HPC solution like Lustre can deliver for a broad community of business and commercial organizations struggling with the challenge of big data and demanding storage growth.
Tags : intel, lustre*, solution for business
     Intel
By: Dell and Intel®     Published Date: Nov 18, 2015
Unleash the extreme performance and scalability of the Lustre® parallel file system for high performance computing (HPC) workloads, including technical ‘big data’ applications common within today’s enterprises. The Dell Storage for HPC with Intel® Enterprise Edition (EE) for Lustre Solution allows end-users that need the benefits of large-scale, high bandwidth storage to tap the power and scalability of Lustre, with its simplified installation, configuration, and management features that are backed by Dell and Intel®.
Tags : 
     Dell and Intel®
By: General Atomics     Published Date: Jan 13, 2015
The term “Big Data” has become virtually synonymous with “schema on read” (where data is applied to a plan or schema as it is ingested or pulled out of a stored location) unstructured data analysis and handling techniques like Hadoop. These “schema on read” techniques have been most famously exploited on relatively ephemeral, human-readable data like retail trends, Twitter sentiment, social network mining, and log files. But what if you have unstructured data that, on its own, is hugely valuable, enduring, and created at great expense? Data that may not immediately be human-readable or indexable by search? This is exactly the kind of data most commonly created and analyzed in science and HPC. Research institutions are awash with such data from large-scale experiments and extreme-scale computing.
Tags : general atomics, big data, metadata, nirvana
     General Atomics
By: Amazon Web Services     Published Date: Jul 25, 2018
Organizations are collecting and analyzing increasing amounts of data, making it difficult for traditional on-premises solutions for data storage, data management, and analytics to keep pace. Amazon S3 and Amazon Glacier provide an ideal storage solution for data lakes. They provide options such as a breadth and depth of integration with traditional big data analytics tools, as well as innovative query-in-place analytics tools that help you eliminate costly and complex extract, transform, and load processes. This guide explains each of these options and provides best practices for building your Amazon S3-based data lake.
Tags : 
     Amazon Web Services
By: Amazon Web Services     Published Date: Jul 25, 2018
As easy as it is to get swept up by the hype surrounding big data, it’s just as easy for organizations to become discouraged by the challenges they encounter while implementing a big data initiative. Concerns regarding big data skill sets (and the lack thereof), security, the unpredictability of data, unsustainable costs, and the need to make a business case can bring a big data initiative to a screeching halt. However, given big data’s power to transform business, it’s critical that organizations overcome these challenges and realize the value of big data. Download now to find out more.
Tags : 
     Amazon Web Services
By: Amazon Web Services     Published Date: Jul 25, 2018
Defining the Data Lake “Big data” is an idea as much as a particular methodology or technology, yet it’s an idea that is enabling powerful insights, faster and better decisions, and even business transformations across many industries. In general, big data can be characterized as an approach to extracting insights from very large quantities of structured and unstructured data from varied sources at a speed that is immediate (enough) for the particular analytics use case.
Tags : 
     Amazon Web Services
By: Amazon Web Services     Published Date: Jul 25, 2018
It is just as easy to be swept up by the ubiquitous hype surrounding big data as it is for organizations to become discouraged by the challenges they encounter while implementing a big data initiative. Concerns regarding big data skill sets (and the lack thereof), security, the unpredictability of data, unsustainable costs, and the need to make a business case can bring a big data initiative to a screeching halt.
Tags : 
     Amazon Web Services
By: Dell Server     Published Date: Aug 07, 2018
IT is in the midst of one of its major transformations. IDC has characterized this paradigm shift as the “third platform,” driven by innovations in cloud, big data, mobility and social technologies. Progressive enterprises are seeking to leverage third-platform technologies to create new business opportunities and competitive differentiation through new products and services, new business models and new ways of engaging customers.
Tags : 
     Dell Server
By: IBM     Published Date: Jun 13, 2018
The cloud offers elasticity and flexibility to meet a range of demands
Tags : 
     IBM
By: TIBCO Software APAC     Published Date: Aug 13, 2018
The popularity of integration platform as a service (iPaaS) started with business users looking to gain control and share data among their proliferating SaaS apps, without needing IT intervention. iPaaS was then adopted by IT to support business users, ensure security measures were being maintained, and provide more of a self-service environment. Now, iPaaS has evolved from a niche solution into a much bigger role. Read this whitepaper to learn about:
• Drivers for cloud integration
• Five emerging use cases for iPaaS that enable better responsiveness, APIs, event-driven capabilities, human workflows, and data analysis
• Questions to ask when evaluating your current solution
Tags : 
     TIBCO Software APAC
By: TIBCO Software APAC     Published Date: Aug 13, 2018
Big data has raised the bar for data virtualization products. To keep pace, TIBCO® Data Virtualization added a massively parallel processing engine that supports big-data scale workloads. Read this whitepaper to learn how it works.
Tags : 
     TIBCO Software APAC
By: TIBCO Software APAC     Published Date: Aug 15, 2018
TIBCO Spotfire® Data Science is an enterprise big data analytics platform that can help your organization become a digital leader. The collaborative user interface allows data scientists, data engineers, and business users to work together on data science projects. These cross-functional teams can build machine learning workflows in an intuitive web interface with a minimum of code, while still leveraging the power of big data platforms. Spotfire Data Science provides a complete array of tools (from visual workflows to Python notebooks) for the data scientist to work with data of any magnitude, and it connects natively to most sources of data, including Apache™ Hadoop®, Spark®, Hive®, and relational databases. While providing security and governance, the advanced analytics platform allows the analytics team to share and deploy predictive analytics and machine learning insights with the rest of the organization, driving action for the business.
Tags : 
     TIBCO Software APAC
By: Splunk     Published Date: Sep 10, 2018
The financial services industry has unique challenges that often prevent it from achieving its strategic goals. The keys to solving these issues are hidden in machine data—the largest category of big data—which is both untapped and full of potential. Download this white paper to learn:
• How organizations can answer critical questions that have been impeding business success
• How the financial services industry can make great strides in security, compliance and IT
• Common machine data sources in financial services firms
Tags : cloud monitoring, aws, azure, gcp, cloud, aws monitoring, hybrid infrastructure, distributed cloud infrastructures, reduce mttr/mtti, cloud monitoring free, cloud monitoring tools, cloud monitoring service, cloud billing monitoring, cloud monitoring architecture, cloud data monitoring, host monitoring, *nix, unix, linux, servers
     Splunk
By: IBM     Published Date: Apr 19, 2018
IBM DB2 with BLU Acceleration helps tackle the challenges presented by big data. It delivers analytics at the speed of thought, always-available transactions, future-proof versatility, disaster recovery and streamlined ease-of-use to unlock the value of data.
Tags : db2, data migration, ibm, oracle
     IBM
By: IBM     Published Date: Jul 02, 2018
Digital transformation is not a buzzword. IT has moved from the back office to the front office in nearly every aspect of business operations, driven by what IDC calls the 3rd Platform of compute, with mobile, social business, cloud, and big data analytics as the pillars. In this new environment, business leaders are facing the challenge of lifting their organization to new levels of competitive capability through digital transformation: leveraging digital technologies together with organizational, operational, and business model innovation to develop new growth strategies. One such challenge is helping the business efficiently reap value from big data and avoid being taken out by a competitor or disruptor that figures out new opportunities from big data analytics before the business does. From an IT perspective, there is a fairly straightforward sequence of applications that businesses can adopt over time that will help put direction into this journey.
Tags : 
     IBM
By: Anaplan     Published Date: Aug 28, 2018
The findings of FSN’s Innovation in the Finance Function global survey prove insightful and compelling—and highlight the critical role that innovative planning technology plays in the field of finance. Organizations today are evolving their operating models in conjunction with rising globalization, the value of big data, technological advances, regulatory changes, and demographic shifts. They are also in pursuit of business growth with the same or—in some cases—fewer resources than before.
Tags : innovation, finance, function, global, study
     Anaplan
Get your white papers featured in the insideHPC White Paper Library contact: Kevin@insideHPC.com