SolidFire Delivers True Storage Agility to the Next Generation Data Center

Element OS Version 6 Provides the Most Complete Range of Enterprise-Class Features in Any All-Flash Array

SolidFire has introduced Version 6 of its Element OS, named Carbon, along with a new set of enterprise-class features for its all-flash arrays. Building on its tremendous success as the benchmark storage architecture for large-scale cloud service providers, SolidFire is rolling out new features that pave the way for enterprises striving to deliver a more agile, automated, and scalable storage infrastructure. The new functionality will be generally available in Q2 2014.

The increasing pressures on Enterprise IT
The era of cloud computing has dramatically raised expectations for both the speed and the cost at which Enterprise IT services are delivered and consumed. This radical shift in expectations has become a driving force behind the transformation of enterprise data centers worldwide.

"Storage is at the core of the next generation data center," commented Dave Wright, SolidFire Founder and CEO, "And neither traditional disk systems nor today's basic all-flash arrays are supporting this transformation in resource allocation and management. Our customers expect great performance from us, but they also expect us to support their broader business objectives to deliver internal storage services that are more agile, scalable, automated, and predictable than ever before."

Bringing storage agility to the Next Generation Data Center
SolidFire recently previewed the features of Element OS 6 and introduced key customers Internap, SunGard, and ServInt before an audience of more than 35 industry analysts and influencers from around the world at the company's first Analyst Day in Boulder, Colorado.

"Solidfire attacks what to me is the most glaring missing element in tomorrow's enterprise data center -- Quality of Service," commented Steve Duplessie, founder and senior analyst of the Enterprise Strategy Group. "As more and more applications are delivered from shared storage infrastructure -- performance predictability and scale have become paramount. That's been the problem with traditional storage architectures in the modern era of infrastructure virtualization."

With this release, SolidFire is introducing a combination of unique features that smooth the enterprise transition to Next Generation Data Center technologies. These new features include:

Introduction of Fibre Channel Connectivity: Adding to its 10Gb iSCSI connectivity, SolidFire introduces 16Gb active/active Fibre Channel (FC) connectivity across its full line of all-flash arrays: the SF3010, SF6010, and SF9010. This added functionality enables enterprise customers to easily transition current FC workloads and take advantage of SolidFire's guaranteed storage performance, system automation, and scale-out architecture.
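
As a rough sketch of the management-side step in such a transition, the snippet below asks the cluster for its FC target ports so their WWPNs can be added to existing fabric zones. The ListFibreChannelPortInfo method name, the API path, and the response fields are assumptions for illustration only.

```python
# Hypothetical sketch: discover the array's Fibre Channel target ports so
# existing fabric zoning can be extended to the SolidFire cluster.
# The API path, method name, and response fields are assumptions.
import requests

def list_fc_target_wwpns(mvip, user, password):
    payload = {"method": "ListFibreChannelPortInfo", "params": {}, "id": 1}
    resp = requests.post(
        f"https://{mvip}/json-rpc/6.0",     # assumed management endpoint
        json=payload,
        auth=(user, password),
        verify=False,                       # lab use only
    )
    resp.raise_for_status()
    result = resp.json().get("result", {})
    # Assumed response shape: a list of port records, each carrying a WWPN string.
    return [port.get("wwpn") for port in result.get("fibreChannelPorts", [])]
```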

Real-Time Replication: SolidFire's Real-Time Replication technology enables the quick and cost-effective creation of additional remote copies of data. Native to the SolidFire design, this functionality delivers essential disaster recovery capabilities to CSP and enterprise customers without the need for third-party hardware or software. The SolidFire replication model is extremely flexible: each cluster can be paired with up to four other clusters and replicate data in either direction, allowing for easy failover and failback.
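
A rough sketch of how the pairing workflow described above might be scripted is shown below: one cluster issues a pairing key and the peer redeems it, after which volumes can replicate in either direction. The StartClusterPairing and CompleteClusterPairing method names and the key exchange are assumptions used for illustration, not documented calls.

```python
# Hypothetical sketch of the cluster-pairing workflow described above.
# Method names, the API path, and field names are assumptions.
import requests

def rpc(mvip, user, password, method, params=None):
    """Minimal JSON-RPC helper against an assumed /json-rpc/6.0 endpoint."""
    resp = requests.post(
        f"https://{mvip}/json-rpc/6.0",
        json={"method": method, "params": params or {}, "id": 1},
        auth=(user, password),
        verify=False,  # lab use only
    )
    resp.raise_for_status()
    return resp.json()["result"]

def pair_clusters(cluster_a, cluster_b, user, password):
    # Step 1: the first cluster generates a one-time pairing key (assumed call).
    key = rpc(cluster_a, user, password, "StartClusterPairing")["clusterPairingKey"]
    # Step 2: the peer cluster redeems the key to complete the pair (assumed call).
    rpc(cluster_b, user, password, "CompleteClusterPairing", {"clusterPairingKey": key})
    # Once paired, individual volumes can be set to replicate in either
    # direction, supporting the failover and failback flow described above.
```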

Mixed-Node Cluster Support: SolidFire storage systems now support combining storage nodes of different capacities, performance levels, and protocols within a single cluster. Within every SolidFire storage system, capacity and performance are managed as two global and separate resource pools. When new storage nodes are added to a cluster, the additional capacity and performance are made immediately available to both existing applications and new workloads. Mixed-Node Cluster Support also allows enterprise customers to continually leverage the economics of the most current flash technology on the market while providing long-term investment protection.
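
The "two global resource pools" idea can be pictured with a simplified model: every node contributes its capacity to one pool and its performance to another, so adding or retiring a node simply adjusts both totals. The sketch below is a toy illustration of that accounting, not SolidFire code, and the capacity and IOPS figures in the example are made up.

```python
# Toy model of the "two global resource pools" described above: each node
# contributes capacity to one pool and performance (IOPS) to another, so
# mixing node generations simply changes the pool totals.
from dataclasses import dataclass

@dataclass
class Node:
    name: str
    capacity_tb: float   # capacity the node contributes to the global pool
    iops: int            # performance the node contributes to the global pool

class Cluster:
    def __init__(self):
        self.nodes = []

    def add_node(self, node: Node):
        """Newly added capacity and performance are available cluster-wide immediately."""
        self.nodes.append(node)

    def remove_node(self, name: str):
        """Decommission a node; the remaining nodes keep serving both pools."""
        self.nodes = [n for n in self.nodes if n.name != name]

    @property
    def capacity_tb(self) -> float:
        return sum(n.capacity_tb for n in self.nodes)

    @property
    def iops(self) -> int:
        return sum(n.iops for n in self.nodes)

# Example with made-up figures: an older node and a newer node share one cluster.
cluster = Cluster()
cluster.add_node(Node("older-node", capacity_tb=2.4, iops=50_000))
cluster.add_node(Node("newer-node", capacity_tb=8.6, iops=75_000))
print(cluster.capacity_tb, cluster.iops)   # both pools grow as nodes are added
```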

"With mixed node support, SolidFire has eradicated the concept of 'generational' or 'forklift upgrades' common with traditional disk and other all-flash storage systems," discussed Matt Loschert, CTO of managed hosting provider ServInt. "As we scale our storage infrastructure we simply add the most current SolidFire platform without downtime or impact to our hosted customers -- resources are instantly available. Decommissioning systems is as simple as adding them. We can take them off line without compromising availability or any of the Quality of Service (QoS) settings that we have established with our customers."

Integrated Backup & Restore: This unique SolidFire functionality provides native snapshot-based backup and restore compatible with any object store or device that exposes an S3- or Swift-compatible API. This first-of-its-kind functionality eliminates the cost and complexity of third-party backup and recovery products while dramatically accelerating backup performance. CSP and enterprise customers can now effortlessly scale backups across thousands of hosts and applications.
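
A rough sketch of what scripting the described snapshot-then-backup flow could look like against a management API and an S3-compatible endpoint is shown below. The CreateSnapshot and StartVolumeBackup method names, the API path, and the target parameters are all assumptions for illustration, not documented calls.

```python
# Hypothetical sketch of the snapshot-based backup flow described above:
# take a point-in-time snapshot of a volume, then ask the cluster to copy it
# to an S3-compatible object store. Method names and parameters are assumptions.
import requests

def rpc(mvip, user, password, method, params):
    resp = requests.post(
        f"https://{mvip}/json-rpc/6.0",   # assumed management endpoint
        json={"method": method, "params": params, "id": 1},
        auth=(user, password),
        verify=False,                     # lab use only
    )
    resp.raise_for_status()
    return resp.json()["result"]

def backup_volume_to_s3(mvip, user, password, volume_id,
                        s3_endpoint, bucket, access_key, secret_key):
    # 1. Snapshot the volume so the backup captures a consistent point in time (assumed call).
    snap = rpc(mvip, user, password, "CreateSnapshot", {"volumeID": volume_id})
    # 2. Start a backup job that streams the snapshot to the object store (assumed call).
    return rpc(mvip, user, password, "StartVolumeBackup", {
        "volumeID": volume_id,
        "snapshotID": snap["snapshotID"],
        "target": {
            "endpoint": s3_endpoint,      # any S3- or Swift-compatible endpoint
            "bucket": bucket,
            "accessKeyID": access_key,
            "secretAccessKey": secret_key,
        },
    })
```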

For more information on SolidFire's all-flash storage systems, including this new release, please see http://www.solidfire.com or schedule a live demo today.
