
Results 1 - 25 of 2601
By: Seagate     Published Date: Jan 27, 2015
This paper is the first to explore a recent breakthrough: the High Performance Computing (HPC) industry’s first Intelligence Community Directive (ICD) 503 (DCID 6/3 PL4) certified, secure scale-out parallel file system solution, the Seagate ClusterStor™ Secure Data Appliance. It is designed to address the government and business enterprise need for collaborative and secure information sharing within a Multi-Level Security (MLS) framework at Big Data and HPC scale.
Tags : 
     Seagate
By: SGI     Published Date: Jun 08, 2016
With High Performance Computing (HPC) supercomputer systems that comprise tens, hundreds, or even thousands of computing cores, users are able to increase application performance and accelerate their workflows to realize dramatic productivity improvements. The performance potential often comes at the cost of complexity. By their very nature, supercomputers comprise a great number of components, both hardware and software, that must be installed, configured, tuned, and monitored to maintain maximum efficiency. In a recent report, IDC lists downtime and latency as two of the most important problems faced by data center managers.
Tags : 
     SGI
By: Penguin Computing     Published Date: Mar 23, 2015
The Open Compute Project, initiated by Facebook as a way to increase computing power while lowering associated costs with hyper-scale computing, has gained a significant industry following. While the initial specifications were created for a Web 2.0 environment, Penguin Computing has adapted these concepts to create a complete hardware ecosystem solution that addresses these needs and more. The Tundra OpenHPC system is applicable to a wide range of HPC challenges and delivers the most requested features for data center architects.
Tags : penguin computing, open computing, computing power, hyper-scale computing, tundra openhpc
     Penguin Computing
By: Adaptive Computing     Published Date: Feb 21, 2014
Big data applications represent a fast-growing category of high-value applications that are increasingly employed by business and technical computing users. However, they have exposed an inconvenient dichotomy in the way resources are utilized in data centers. Conventional enterprise and web-based applications can be executed efficiently in virtualized server environments, where resource management and scheduling is generally confined to a single server. By contrast, data-intensive analytics and technical simulations demand large aggregated resources, necessitating intelligent scheduling and resource management that spans a computer cluster, cloud, or entire data center. Although these tools exist in isolation, they are not available in a general-purpose framework that allows them to interoperate easily and automatically within existing IT infrastructure.
Tags : 
     Adaptive Computing
By: Seagate     Published Date: Sep 30, 2015
Although high-performance computing (HPC) often stands apart from a typical IT infrastructure—it uses highly specialized scale-out compute, networking and storage resources—it shares with mainstream IT the ability to push data center capacity to the breaking point. Much of this is due to data center inefficiencies caused by HPC storage growth. The Seagate® ClusterStor™ approach to scale-out HPC storage can significantly improve data center efficiency. No other vendor solution offers the same advantages.
Tags : 
     Seagate
By: IBM     Published Date: Nov 14, 2014
View this demo to learn how IBM Platform Computing Cloud Service running on the SoftLayer Cloud helps you: quickly get your applications deployed on ready-to-run clusters in the cloud; manage workloads seamlessly between on-premise and cloud-based resources; get help from the experts with 24x7 Support; share and manage data globally; and protect your IP through physical isolation of bare metal hardware assets.
Tags : 
     IBM
By: IBM     Published Date: Nov 14, 2014
View this series of short webcasts to learn how IBM Platform Computing products can help you maximize the agility of your distributed computing environment by improving operational efficiency, simplifying the user experience, optimizing application usage and license sharing, addressing spikes in infrastructure demand, and reducing data management costs.
Tags : 
     IBM
By: Panasas     Published Date: Oct 02, 2014
HPC and technical computing environments require the collection, storage, and transmission of large-scale datasets. To meet these demands, data center architects must consider how increasing storage capacity over time will affect HPC workloads, performance, and system availability. While many enterprises have looked to scale-up NAS to meet their storage needs, this approach can lead to data islands that make it difficult to share data. Distributed, scale-out storage was developed to get around the technology limitations of scale-up NAS architectures.
Tags : 
     Panasas
By: General Atomics     Published Date: Jan 13, 2015
The term “Big Data” has become virtually synonymous with “schema on read” (where data is applied to a plan or schema as it is ingested or pulled out of a stored location) and with unstructured data analysis and handling techniques like Hadoop. These “schema on read” techniques have been most famously exploited on relatively ephemeral human-readable data like retail trends, Twitter sentiment, social network mining, and log files. But what if you have unstructured data that, on its own, is hugely valuable, enduring, and created at great expense? Data that may not immediately be human readable or indexable by search? Exactly the kind of data most commonly created and analyzed in science and HPC. Research institutions are awash with such data from large-scale experiments and extreme-scale computing that is used for high-consequence
Tags : general atomics, big data, metadata, nirvana
     General Atomics
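The “schema on read” idea described in the General Atomics abstract above can be sketched in a few lines: raw records are stored untyped, and a schema (types, defaults) is applied only at read time. A minimal illustration in Python; the field names and defaults are ours, not from any of the listed products:

```python
import json

# Raw "data lake" records: stored as-is, with no schema enforced at write time.
raw_records = [
    '{"user": "alice", "score": "42"}',
    '{"user": "bob"}',  # missing field: tolerated until read time
]

def read_with_schema(line):
    """Apply a schema at read time: coerce types and fill defaults."""
    rec = json.loads(line)
    return {
        "user": str(rec.get("user", "unknown")),
        "score": int(rec.get("score", 0)),  # cast on read, not on write
    }

rows = [read_with_schema(line) for line in raw_records]
print(rows)  # both records conform to the schema only after being read
```

The contrast with “schema on write” is that the malformed second record is accepted at ingest and only normalized when a consumer reads it.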
By: HP     Published Date: Oct 08, 2015
Administrators, engineers and executives are now tasked with solving some of the world’s most complex challenges. This could revolve around advanced computations for science, business, education, pharmaceuticals and beyond. Here’s the challenge: many data centers are reaching peak levels of resource consumption, and there’s more work to be done. So how are engineers and scientists supposed to continue working with such high-demand applications? How can they continue to create ground-breaking research while still utilizing optimized infrastructure? How can a platform scale to the new needs and demands of these types of users and applications? This is where HP Apollo Systems help reinvent the modern data center and accelerate your business.
Tags : apollo systems, reinventing hpc and the supercomputer, reinventing modern data center
     HP
By: F5 Networks Singapore Pte Ltd     Published Date: Sep 09, 2019
Tech advances like the cloud, mobile technology, and the app-based software model have changed the way today’s modern business operates. They’ve also changed the way criminals attack and steal from businesses. Criminals strive to be agile in much the same way that companies do. Spreading malware is a favorite technique among attackers. According to the 2019 Data Breach Investigations Report, 28% of data breaches included malware.¹ While malware’s pervasiveness may not come as a surprise to many people, what’s not always so well understood is that automating app attacks—by means of malicious bots—is the most common way cybercriminals commit their crimes and spread malware. It helps them achieve scale.
Tags : 
     F5 Networks Singapore Pte Ltd
By: BeyondTrust     Published Date: Sep 24, 2019
How to Use This Buyer’s Guide
Today, privileges are built into operating systems, file systems, applications, databases, hypervisors, cloud management platforms, DevOps tools, robotic automation processes, and more. Cybercriminals covet privileges/privileged access because it can expedite access to an organization’s most sensitive targets. With privileged credentials and access in their clutches, a cyberattacker or piece of malware essentially becomes an “insider”.
Tags : 
     BeyondTrust
By: F5 Networks Singapore Pte Ltd     Published Date: Sep 19, 2019
Safeguarding the identity of users and managing the level of access they have to critical business applications could be the biggest security challenge organizations face in today’s assumed-breach world. Over 6,500 publicly disclosed data breaches occurred in 2018 alone, exposing over 5 billion records—a large majority of which included usernames and passwords.¹ This wasn’t new to 2018, though, as evidenced by the existence of an online, searchable database of 8 billion username and password combinations that have been stolen over the years (https://haveibeenpwned.com/), keeping in mind there are only 4.3 billion people worldwide that have internet access. These credentials aren’t stolen just for fun—they are the leading attack type for causing a data breach. And the driving force behind the majority of credential attacks are bots—malicious ones—because they enable cybercriminals to achieve scale. That’s why prioritizing secure access and bot protection needs to be part of every organ
Tags : 
     F5 Networks Singapore Pte Ltd
By: Pure Storage     Published Date: Sep 13, 2019
By asking the right questions as you design your hybrid cloud, you maximize your chances for success. What do you want to achieve? Identifying your goals will help you zero in on the biggest pain points and attack those first. Learn how to increase the effectiveness of your hybrid cloud with a focus on data strategies for running hybrid applications. Use this guide to:
• Discover key differences between enterprise IT environments and the public cloud
• Learn whether you can support enterprise applications in the public cloud
• See whether you can support cloud-native applications on-premises
• Understand whether you can protect data across your hybrid cloud
Tags : 
     Pure Storage
By: Gigamon     Published Date: Sep 03, 2019
We’ve arrived at the second anniversary of the Equifax breach, and we now know much more about what happened thanks to the August 2018 release of the GAO report. New information came out of that report that was not well understood at the time of the breach. For example, did you know that while Equifax used a tool for network-layer decryption, its certificates were nine months out of date? This lapse gave the threat actors all the time they needed to break in and exfiltrate reams of personal data. As soon as Equifax updated the certs on its decryption tools, it began to realize what had happened. On the heels of the Equifax breach, we are reminded of the importance of efficient decryption for effective threat detection. That’s more important than ever today; Ponemon Institute reports that 50% of all malware attacks utilize encryption. During this webinar, we’ll talk about:
• How TLS/SSL encryption has become a threat vector
• Why decryption is essential to security and how to effectively pe
Tags : 
     Gigamon
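The expired-certificate lapse described in the Gigamon abstract above is the kind of condition that is cheap to check for programmatically. A minimal sketch using Python's standard `ssl` module to test whether a certificate's notAfter date has passed; the date string is an illustrative example, not Equifax's actual certificate:

```python
import ssl
import time

# A certificate's notAfter field, in the format the ssl module parses.
# This particular date is illustrative only.
not_after = "Jan 15 12:00:00 2018 GMT"

# ssl.cert_time_to_seconds converts the string to a Unix timestamp.
expiry = ssl.cert_time_to_seconds(not_after)
now = time.time()

if now > expiry:
    days = int((now - expiry) // 86400)
    print(f"certificate expired {days} days ago")
```

In practice a monitoring job would pull `notAfter` from each certificate in the decryption tool's trust store and alert well before expiry, rather than after.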
By: Gigamon     Published Date: Sep 03, 2019
The IT pendulum is swinging to distributed computing environments, network perimeters are dissolving, and compute is being distributed across various parts of organizations’ infrastructure—including, at times, their extended ecosystem. As a result, organizations need to ensure the appropriate levels of visibility and security at these remote locations, without dramatically increasing staff or tools. They need to invest in solutions that can scale to provide increased coverage and visibility, but that also ensure efficient use of resources. By implementing a common distributed data services layer as part of a comprehensive security operations and analytics platform architecture (SOAPA) and network operations architecture, organizations can reduce costs, mitigate risks, and improve operational efficiency.
Tags : 
     Gigamon
By: Hitachi EMEA     Published Date: Sep 25, 2019
With the right approach, new and growing data sources can help you drive innovation, transform customer experience and create new revenue streams. But all that data can also slow you down.
Tags : 
     Hitachi EMEA
By: Juniper Networks     Published Date: Sep 26, 2019
"SD-WAN largely exists today to support two key enterprise transformations: multicloud and the software-defined branch (SD-Branch). Multicloud has changed the center of gravity for enterprise applications, and with that, has changed traffic patterns too. No longer does traffic need to flow to enterprise data center sites or central internet peering points and breakouts. That’s because most traffic from users and devices in the enterprise campus and branch today goes to cloud-based applications scattered across a host of clouds. It’s neither economical nor efficient to haul traffic over WAN-dedicated links to a central enterprise site. So to optimize the cost and performance of multicloud-bound traffic, modern WAN edge routers, often called customer premises equipment (CPE), are now equipped with hybrid WAN links and routing. Hybrid WAN interfaces may include WAN provider-dedicated links such as MPLS, as well as direct internet links over xDSL, broadband and 4G/LTE wireless."
Tags : 
     Juniper Networks
By: HERE Technologies     Published Date: Sep 26, 2019
There are many challenging tasks when developing autonomous driving features to cope with the various changes to the environment. Often lane markings are faded or are covered with snow or dirt and can be difficult for a camera-based detection system. In this report, VSI addresses the application of HD map assets to improve the safety and performance of automated vehicle features within the context of lane keeping and trajectories. VSI has been examining applications of HD maps in our test vehicle. In a previous report, we discussed map-based Adaptive Cruise Control (ACC) using the advised speed attributes from HERE’s HD map data. In this report, we apply HERE’s HD map data to a lane keeping application and examine performance of lane keeping with a map-based approach compared to a camera and computer vision-based approach.
Tags : 
     HERE Technologies
By: Cisco Umbrella EMEA     Published Date: Aug 08, 2019
Today’s security appliances and agents must wait until malware reaches the perimeter or endpoint before they can detect or prevent it. OpenDNS arrests attacks earlier in the kill chain. Enforcing security at the DNS layer prevents a malicious IP connection from ever being established or a malicious file from ever being downloaded. This same DNS layer of network security can contain malware and any compromised system from exfiltrating data. Command & control (C2) callbacks to the attacker’s botnet infrastructure are blocked over any port or protocol. Unlike appliances, the cloud service protects devices both on and off the corporate network. Unlike agents, the DNS layer protects every device connected to the network — even IoT. It is the easiest and fastest layer of security to deploy everywhere.
Tags : security, opendns, cisco
     Cisco Umbrella EMEA
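The DNS-layer enforcement described above amounts to a policy check before any name is resolved: if the domain is disallowed, no IP address is returned and no connection is ever established. A hedged sketch of the idea; the blocklist entries and helper function are illustrative, not OpenDNS or Umbrella APIs:

```python
# Minimal sketch of DNS-layer policy enforcement: refuse to resolve
# domains on a blocklist before any connection can be established.
BLOCKLIST = {"malware-c2.example", "phish.example"}  # illustrative entries

def resolve_with_policy(domain, resolver):
    """Return an IP only if the domain passes the policy check."""
    if domain.rstrip(".").lower() in BLOCKLIST:
        return None  # blocked at the DNS layer: no connection is ever made
    return resolver(domain)

# Stub resolver standing in for a real DNS lookup.
fake_dns = {"example.com": "93.184.216.34"}.get

print(resolve_with_policy("malware-c2.example", fake_dns))  # None: blocked
print(resolve_with_policy("example.com", fake_dns))
```

The same check applied to outbound lookups is what blocks command-and-control callbacks: the compromised host simply never learns the address of the attacker's infrastructure.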
By: Cisco Umbrella EMEA     Published Date: Sep 02, 2019
We live and surf in a cyber world where attacks like APT, DDoS, Trojans and ransomware are common and easy to execute. Domain names are an integral part of any business today, and apparently an integral part of an attacker's plan too. Domain names are carriers of malware, they act as command-and-control servers, and malware exfiltrates data through them too. In today's threat landscape, predicting, spotting and mitigating threats is crucial. This is called visibility and analytics. Watch this on-demand session with our Cisco cloud security experts Shyam Ramaswamy and Fernando Ferrari as they talk about how Cisco Umbrella and the Umbrella Research team detect anomalies, block threats and identify compromised hosts. The experts also discuss how Cisco effectively spots and reacts to threats, filters out IOCs, blocks the network communications of malware, and identifies and stops phishing campaigns (including unknown ones).
Tags : 
     Cisco Umbrella EMEA
By: Cisco Umbrella EMEA     Published Date: Sep 02, 2019
Cloud applications provide scale and cost benefits over legacy on-premises solutions. With more users going direct-to-internet from any device, the risk increases when users bypass security controls. We can help you reduce this risk across all of your cloud and on-premises applications with a zero-trust strategy that validates devices and domains, not just user credentials. See why thousands of customers rely on Duo and Cisco Umbrella to reduce the risks of data breaches and improve security. Don’t miss this best-practices discussion focused on the key role DNS and access control play in your zero-trust security strategy. Attendees will learn how to:
• Reduce the risk of phishing attacks and compromised credentials
• Improve speed-to-security across all your cloud applications
• Extend security on and off-network without sacrificing usability
Tags : 
     Cisco Umbrella EMEA
By: TIBCO Software     Published Date: Jun 14, 2019
As a recognized leader in master data management (MDM) and a pioneer in data asset management, TIBCO EBX™ software is an innovative, single solution for managing, governing, and consuming all your shared data assets. It includes all the enterprise-class capabilities you need to create data management applications, including user interfaces for authoring and data stewardship, workflow, hierarchy management, and data integration tools. And it provides an accurate, trusted view of business functions, insights, and decisions to empower better decisions and faster, smarter actions. Download this datasheet to learn:
• What makes EBX™ software unique
• The capabilities of EBX software
• The data it manages
Tags : 
     TIBCO Software
By: Tricentis     Published Date: Aug 19, 2019
Think back just 5 years ago. In 2014…
• The seminal DevOps book—Gene Kim’s The Phoenix Project—was one year old
• Gartner predicted that 25% of Global 2000 enterprises would adopt DevOps to some extent by 2016¹
• “Continuous Testing” just started appearing in industry publications and conferences²
• Many of today’s popular test frameworks were brand new—or not yet released
• The term “microservices” was just entering our lexicon
• QC/UFT and ALM were still sold by HP (not even HPE yet)
• Only 30% of enterprise software testing was performed fully “in house”³
• There was no GDPR restricting the use of production data for software testing
• Packaged apps were typically updated on an annual or semi-annual basis, and modern platforms like SAP S/4HANA and Salesforce Lightning hadn’t even been announced
Times have changed—a lot. If the way that you’re testing hasn’t already transformed dramatically, it will soon. And the pace and scope of disruption will continue to escalate throughout the fo
Tags : 
     Tricentis
By: Intel     Published Date: Sep 27, 2019
The stakes are high in today's data centers. Organisations have access to massive quantities of data promising valuable insights and new opportunities for business. But data center architects need to rethink and redesign their system architectures to ingest, store and process all that information. Similarly, application owners need to assess how they can process data more effectively. Those who don't re-architect might find themselves scrambling just to keep from being drowned in a data deluge.
Tags : 
     Intel
Get your white papers featured in the insideHPC White Paper Library. Contact: Kevin@insideHPC.com