The Big Data challenge has an answer in SolidFire

In 2014, about 35% of computer users accessed the cloud, according to the UK Office for National Statistics. That figure has since almost doubled and within three years, 86% of workloads are expected to be processed in cloud data centres.

The quantity of data is also doubling every 18 months, and forecasts suggest that data production in 2020 will be 44 times greater than it was in 2009. Individuals today create 70% of all data; enterprises store 80% of it.

The World Economic Forum has identified six software and services megatrends, and the Fourth Industrial Revolution is now under way.

The growth in Big Data brings the challenge of finding the information that is relevant to you and making sense of it. This is part of the disruption that Big Data is driving.

However, these challenges also present opportunities. For example, data analytics is leading to insights into virtually any field you can think of.

Not long ago, a biotech company used data analytics to sort through GBs of data, leading it to isolate 23 optimal genes. That work produced the first gender-specific diagnostic tests for heart disease. Technologies such as SolidFire help make developments like these possible. SolidFire offers specific advantages for data storage requirements, both now and in the future, through the following attributes:

  • Agility: With SolidFire, enterprises can support specific solutions and adapt on the go to multiple workload environments, without affecting the performance of existing applications.
  • Reliability: A key requirement for next-generation data centres is repeatable, predictable performance. With SolidFire, businesses can specify and guarantee minimum, maximum, and burst IOPS (input/output operations per second) for individual storage volumes on the fly, independent of capacity.
  • Automation: SolidFire not only provides application programming interfaces (APIs) for automating storage management, but exposes every storage function of the array through the API. Data availability is also highly automated.
  • Easy scalability: SolidFire's patented Quality of Service (QoS) performance virtualisation allows businesses to manage storage performance independently of storage capacity. As a result, SolidFire can deliver predictable storage performance to thousands of applications within a shared infrastructure. The architecture also scales capacity and performance linearly as nodes are added, up to 3.4 petabytes of effective capacity and a potential 7.5 million guaranteed IOPS.
  • Redundancy: SolidFire's architecture avoids sharing any hardware component in the system. Connectivity between nodes is redundant, so any component in the cluster can fail and the system will still run. If one or more nodes fail, SolidFire automatically rebuilds redundant data across the remaining nodes in minutes, restoring full redundancy while maintaining all guaranteed QoS settings.
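To illustrate the automation and QoS points above, the sketch below builds a JSON-RPC request of the kind SolidFire's Element API accepts for setting per-volume minimum, maximum, and burst IOPS. The method name (`ModifyVolume`), parameter names, and values here are assumptions for illustration; check them against the Element API reference for your array's software version before use.

```python
import json

def build_qos_request(volume_id, min_iops, max_iops, burst_iops):
    """Build a JSON-RPC payload that sets per-volume QoS limits.

    The method and field names mirror the SolidFire Element API's
    ModifyVolume call, but are illustrative assumptions here --
    verify them against your array's API documentation.
    """
    return {
        "method": "ModifyVolume",
        "params": {
            "volumeID": volume_id,
            "qos": {
                "minIOPS": min_iops,     # guaranteed floor
                "maxIOPS": max_iops,     # sustained ceiling
                "burstIOPS": burst_iops, # short-term burst allowance
            },
        },
        "id": 1,
    }

# Example: guarantee 500 IOPS to volume 42, cap it at 15,000,
# and allow bursts to 20,000 -- all without touching capacity.
payload = build_qos_request(42, 500, 15000, 20000)
print(json.dumps(payload, indent=2))
```

In practice this payload would be POSTed to the cluster's management endpoint over HTTPS; the point is that every setting shown in the list above is addressable programmatically, which is what makes storage automation at scale feasible.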

Remember, without control and the ability to scale, performance is just a Band-Aid! See a summary of what SolidFire offers by viewing this infographic.