Agile management changes the business


In the 20th-century model of business management, anything of consequence was decided at the top: big leaders appointed little leaders, competition for power constricted change, work was assigned and control was paramount.

The trouble with this 20th-century model is precisely that it is 20th-century: it struggles to compete with agile 21st-century businesses. The 21st-century business adapts fast, with customer centricity as the driver of change and a disrupter for growth.

The older model has the company at the centre, with customers in orbit to be manipulated. In contrast, the modern, agile business model has the customer at the heart with the company in orbit, looking for ways to delight the customer.

In the latter kind of business, everyone in the organisation understands how their work contributes to the focus on the customer.

In agile management:

  •  Leaders see themselves and act as enablers rather than controllers
  •  Work is coordinated through structured, customer-focused practices, not bureaucracy
  •  Leaders model transparency and continuous improvement
  •  Communication is open and conversational, rather than top-down and hierarchical.

Manufacturing is in the early stages of adopting agile management. However, this trend will speed up as physical products become more software-driven and join the Internet of Things, with more and more devices and appliances connected to the internet.

In other sectors, the debate is already over, with attention on improving agile methodologies, learning how to apply them across different teams, and reconciling team goals, practices and values with company goals, values and practices.

One of the United Kingdom’s leading hotel, restaurant and coffee shop operators, which boasts 45,000 employees, recently upgraded its IT systems to become more agile.

The new system gave it a flexible platform, enabling the company to be responsive and adaptable to its market and to other business demands. As a result, its business systems are more easily and widely available and suffer less downtime, and the company is more efficient and productive.

At the heart of this success is the efficient management of data – the traditional data centre just couldn’t cut it.

Next generation data centres are leading the way in enabling agile business management, and SolidFire, an all-flash array, is empowering enterprises to adapt to massive IT changes. Even as little as two years ago, spinning-disk storage providers didn’t see an all-flash system like SolidFire as a threat because of its cost. But prices have fallen and are still falling, and all-flash storage offers powerful advantages.

It keeps up with radical change: it can be expanded with no downtime and no reconfiguration, and it offers faster access to data.

Let’s compare a traditional system with SolidFire:

Traditional system       | SolidFire
Single tenant            | Multi-tenant
Isolated workloads       | Mixed workloads
Dedicated infrastructure | Shared infrastructure
Scale-up                 | Scale-out
Manual administration    | Automation

Next generation data storage has to be at the heart of making your business agile. Download our infographic to see how SolidFire can help you.

 

The Big Data challenge has an answer in SolidFire

In 2014, about 35% of computer users accessed the cloud, according to the UK Office for National Statistics. That figure has since almost doubled, and within three years 86% of workloads are expected to be processed in cloud data centres.

The quantity of data is doubling every 18 months too and the forecasts are that data production will be 44 times greater in 2020 than it was in 2009. Individuals today create 70% of all data. Enterprises store 80%.

The World Economic Forum has identified six software and services megatrends, and the Fourth Industrial Revolution is now under way.

The growth in Big Data brings challenges: finding the information that is relevant to you, and making sense of it. These challenges are part of the disruption that Big Data drives.

However, these challenges also present opportunities. For example, data analytics is leading to insights into virtually any field you can think of.

Not long ago a biotech company used data analytics to sort through gigabytes of data, which led them to isolate 23 optimal genes – and that in turn led to the first gender-specific diagnostic tests for heart disease. Technologies such as SolidFire help to make developments like these possible. SolidFire offers specific advantages for data storage, now and in the future, through the following attributes:

  • Agility: With SolidFire, enterprises can support specific solutions and adapt on the go to multiple workload environments, without affecting the performance of existing applications.
  • Reliability: A key requirement for next generation data centres is repeatable, predictable performance. With SolidFire, businesses can specify and guarantee minimum, maximum, and burst IOPS (input/output operations per second) for individual storage volumes on the fly, independent of capacity.
  • Automation: SolidFire not only has application programming interfaces (APIs) for automating storage management, but exposes every storage function of the array through the API. Data availability is also highly automated.
  • Easy scalability: SolidFire’s patented Quality of Service (QoS) technology virtualises performance as a resource, allowing businesses to manage storage performance independently of storage capacity. Because of this, SolidFire can deliver predictable storage performance to thousands of applications within a shared infrastructure. The architecture also scales capacity and performance linearly as nodes are added, up to 3.4 petabytes of effective capacity and a potential 7.5 million guaranteed IOPS.
  • Redundancy: SolidFire’s shared-nothing architecture means no hardware component is shared between nodes. Connectivity between nodes is redundant, so anything in the cluster can fail and the system will still run. If one or more nodes fail, SolidFire automatically rebuilds redundant data across the remaining nodes in minutes, restoring full redundancy while maintaining all guaranteed QoS settings.
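To make the QoS idea above concrete, per-volume minimum, maximum and burst IOPS are typically set through the array’s management API. The sketch below is a hypothetical illustration: the JSON-RPC method name (`ModifyVolume`) and parameter names (`volumeID`, `minIOPS`, `maxIOPS`, `burstIOPS`) are assumptions for the example, so check your array’s API reference for the exact call it exposes.

```python
import json

def build_qos_request(volume_id, min_iops, max_iops, burst_iops):
    """Build a JSON-RPC payload that sets per-volume QoS limits.

    Hypothetical sketch: the method and field names below are
    assumptions for illustration, not a documented API call.
    """
    if not (min_iops <= max_iops <= burst_iops):
        raise ValueError("expected min <= max <= burst IOPS")
    return json.dumps({
        "method": "ModifyVolume",        # assumed method name
        "params": {
            "volumeID": volume_id,
            "qos": {
                "minIOPS": min_iops,     # guaranteed floor
                "maxIOPS": max_iops,     # sustained ceiling
                "burstIOPS": burst_iops, # short-term burst ceiling
            },
        },
        "id": 1,
    })

# Guarantee 1,000 IOPS to volume 42, cap it at 5,000, allow bursts to 8,000.
request = build_qos_request(42, 1000, 5000, 8000)
```

The key design point is that the floor, ceiling and burst are properties of each volume, set independently of its capacity, which is what lets mixed workloads share one cluster without starving each other.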

Remember, without control and the ability to scale, performance is just a Band-Aid!  See a summary of what SolidFire offers by viewing this infographic.

Future-storage solutions must be adaptable and flexible


Business managers used to look at data and ask, “How cheaply can we store this stuff?” Not now, though. Now the question is likely to be, “How fast can we get this processed and get the analysis back?”

There’s more data, and more pressure on making sense out of it all.

Consider that 90% of all data in existence was created in the past two years. Data is growing by about 65% a year, which works out at roughly 100% growth every 18 months. In other words, we are talking about exabytes and zettabytes of data, and even yottabytes before long, and we have largely gone past recognising any real boundaries on capacity.
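Those two growth figures are consistent with each other, as a quick compound-growth check shows:

```python
# 65% annual growth, compounded over 1.5 years (18 months):
growth_18_months = 1.65 ** 1.5  # about 2.12, i.e. data roughly doubles
```

So a 65% annual rate compounds to a little over 100% every 18 months, which is where the "doubling every 18 months" figure comes from.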

The vast quantities of data involved with mobile devices, smart sensors, the Internet of Things, connected vehicles and more, demand solutions that go beyond adding rows upon rows of last-generation hardware and software to crunch digits. Performance agility and flexibility now are crucial parts of platform architecture.

In 2009, all-flash storage was thought to be a niche technology that would remain so. The economics of disk looked too good to be threatened by another medium.

Seven years later we can see that all-flash arrays will replace Tier 1 and Tier 2 storage solutions, and possibly Tier 3. The future for disk is archival data.

Fujitsu are there now with SolidFire. Our all-flash array can be customised – in size, by scaling out from four nodes to 100; in capacity, from 20 terabytes to many petabytes; and in speed, from 200,000 IOPS (input/output operations per second) to millions.

With SolidFire and its Quality of Service (QoS) architecture, both SQL (Structured Query Language) and NoSQL (Not only SQL) database workloads run properly without unnecessary expenditure. That’s because SolidFire lets you configure your system to meet your specific capacity and performance demands.

At the same time, SolidFire is adaptable and flexible. For example, its Quality of Service controls make it simple to mix and match almost any workload types within the shared infrastructure while maintaining predictable performance to each application.

Administrators can choose to run many instances of one type of workload or run any combination of block storage workloads without compromising performance.

Scalable databases, including read-heavy and write-heavy instances, can go onto SolidFire and be protected with QoS.

Do you need to create dozens of distributed web application instances using a rapid cloning process, and then double the number of workloads quickly and without affecting the performance and availability of the running instances?

Not a problem.

Or you might want to stage a production database to a test/development environment, while it’s running, without slowing the performance of the workload.

Easy as.

To learn more about how SolidFire is the all-rounder you need for your future storage needs, read the white paper.