Security Through Data Analytics

The best way to protect the infrastructure, the brand and the consumer

Given the mountains of data now floating around, it is perhaps inevitable that the very function of data analytics is seen as somehow intrusive. There's a constant glut of reports, columns and other stories bemoaning the lack of data privacy - and at times, they're entirely justified. Organizations have a solemn duty to protect their customers' data, and those that don't implement the proper safeguards deserve to be vilified.

But beneath the surface lurks another dimension of this discussion that is often overlooked. Ethical and effective data analytics enhances security. Ethical and effective data analytics protects not only the institutions that possess the data, but also the consumers that data reflects. Ethical and effective data analytics serves a good purpose.

Let's be clear about the parameters of this argument. Data doesn't exist in a vacuum - it's generated on an ongoing basis through multiple activities, created in numerous formats and comes in through a variety of channels. At any given time, it is being analyzed and used (and occasionally misused) to serve many different needs.

Of course, when done right, information services and analytics represent a key driver of most business decisions. Actionable intelligence based on real data doesn't just augment gut instinct; it leads to quantitative thinking that supports strategic initiatives, enables tactical outreach and boosts the bottom line. Perhaps most important, it enhances information security so as to protect customer privacy and prevent operational and brand damage.

High-profile assaults on retailers like Target and Neiman Marcus, or clandestine downloads of classified information from the National Security Agency (NSA), make more news than inside-the-infrastructure DDoS attacks, but the latter are even more insidious. There are over 2,000 DDoS attacks every day, and some 65 percent of all organizations see three or more attacks each year. The devastation is felt on an organizational level, and the financial impact is just as significant: a DDoS attack can cost up to $100K an hour.

DDoS mitigation can be an enormous challenge. Making an accurate distinction between normal, benign Internet traffic and malicious activity that could be the root cause of a potential DDoS attack is critical, but it's not easy. This is in part because DDoS attacks, especially when they serve as the front line of advanced targeted attacks, are remarkably sophisticated. They rely on stealth techniques that go unnoticed within the enterprise for long periods. They're highly customized, based specifically on each target's infrastructure and defenses, and can often defeat defense strategies that rely primarily on signature-based detection. Then of course there's the cloud. When attacks become super-sized, the defensive strategies in place must have the capacity to scrub far larger volumes of bad traffic.
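Making that distinction between benign and malicious traffic typically starts with comparing current request rates against a rolling baseline of recent behavior. The sketch below is purely illustrative - the function name, window size, and threshold factor are assumptions, not any vendor's API - and a real mitigation pipeline would also weigh source diversity, protocol mix, and cloud-scale scrubbing capacity.

```python
from collections import deque

def flag_traffic_spikes(samples, window=10, factor=3.0):
    """Flag request-rate samples that far exceed a rolling baseline.

    samples: iterable of requests-per-second counts (a hypothetical feed).
    Returns indices of samples judged anomalous. Flagged samples are kept
    out of the baseline so an ongoing attack cannot poison it.
    """
    recent = deque(maxlen=window)  # sliding window of "normal" rates
    flagged = []
    for i, rate in enumerate(samples):
        if len(recent) == window:
            baseline = sum(recent) / window
            if rate > factor * baseline:
                flagged.append(i)
                continue  # do not fold attack traffic into the baseline
        recent.append(rate)
    return flagged
```

For example, ten calm samples around 100 requests per second followed by a burst of 1,000 would flag the burst while leaving ordinary fluctuation alone.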

This is why information services and analytics are so crucial: they improve situational awareness and shorten reaction time. When it comes to leveraging Big Data within the enterprise to help identify breach attempts, it's still early days. According to a January 2014 report from Gartner, eight percent of enterprises are using data and analytics to identify security flaws. But there's reason for optimism - the same report estimates that within the next two years, around 25 percent of enterprises will leverage Big Data for security purposes.

It is this same pattern-searching approach that the enterprise should take when it comes to DDoS mitigation. Continuous, proactive site monitoring - in particular with a centralized view of traffic patterns - enables organizations to identify anomalies and threats before they become real problems. For example, if a custom application is being exploited in a directed attack to steal customer data, the detection solution must be able to identify and highlight the fact that there is a new kind of application traffic on the network.
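Spotting "a new kind of application traffic" can be reduced, at its simplest, to comparing live flows against a baseline of what the network normally carries. In this toy sketch, (protocol, destination port) pairs stand in for the far richer fingerprints a real detection product would use; the function and its inputs are hypothetical.

```python
def novel_app_traffic(baseline_flows, live_flows):
    """Report application traffic never observed during the baseline period.

    baseline_flows: (protocol, destination_port) pairs seen in normal operation.
    live_flows: pairs observed right now.
    Returns the sorted set of flows with no precedent in the baseline -
    candidates for analyst review or automated blocking.
    """
    known = set(baseline_flows)
    return sorted(set(live_flows) - known)
```

With a baseline of ordinary web and DNS traffic, an unexpected service on port 8081 would surface immediately, while familiar traffic stays quiet.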

This might be a new concept at the enterprise level, but banks have been doing it for years with their fraud protection services. Banks monitor a customer's transaction activity, and when a purchase is made that does not fit the usual spending behavior, it is stopped and flagged with the customer. The same thing should - and will - happen at the enterprise level.
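The bank analogy can be sketched in a few lines: score a new purchase against the customer's historical mean and standard deviation, and hold anything that deviates too far. The helper below is a hypothetical simplification - real fraud engines weigh merchant, geography, timing, and much more - but it shows the same baseline-versus-outlier idea the article applies to enterprise traffic.

```python
import statistics

def flag_transaction(history, amount, threshold=3.0):
    """Flag a purchase that does not fit a customer's usual spending.

    history: past transaction amounts for one customer (needs at least two).
    Returns True when the new amount lies more than `threshold` standard
    deviations from the customer's historical mean.
    """
    mean = statistics.mean(history)
    stdev = statistics.stdev(history)
    if stdev == 0:
        return amount != mean  # flat history: any change is unusual
    return abs(amount - mean) / stdev > threshold
```

A customer whose purchases cluster around $30 would sail through a $28 charge, while a sudden $2,000 charge would be stopped and flagged for confirmation.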

It's easy to see why information services and analytics are too often seen as a potential invasion of privacy. Data privacy is vital, and it should rightfully be a corporate priority. However, in the ongoing effort to secure data, the right kind of analytics can be the best weapon of all.

More Stories By Mark Bregman

Mark F. Bregman is Senior Vice President and Chief Technology Officer at Neustar. He joined the Neustar executive team in August 2011 and is responsible for Neustar’s product technology strategy and product development efforts.

Prior to joining Neustar, Dr. Bregman was Executive Vice President and Chief Technology Officer of Symantec, a post he had held since 2006. His portfolio as CTO of Symantec Corporation included developing the company’s technology strategy and overseeing its investments in advanced research and development, security and technology services.

Prior to Symantec, Dr. Bregman served as Executive Vice President, Product Operations at Veritas Corporation, which merged with Symantec in 2005. Prior to Veritas, he was CEO of AirMedia, an early mobile content marketplace, and spent 16 years in a variety of roles at IBM. Dr. Bregman serves on the Board of the Bay Area Science & Innovation Consortium and the Anita Borg Institute, which focuses on increasing the impact of women on all aspects of technology.
