7 Must-Knows When Considering a Data Centre Solution

Information and communications technology downtime and delays cost businesses an average of $9,000 per minute ($540,000 per hour), according to the Ponemon Institute. The financial services sector took top honours, with nearly a million dollars in costs per outage.

But it’s not all bad news.

The cost of fixing the problem is relatively small: a modest investment in ICT reliability pays for itself by reducing productivity and revenue losses. Investment in the Australian data centre services market is predicted to grow at a CAGR of 12.4% over the next five years.
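
As a quick back-of-the-envelope illustration of those numbers (the outage duration below is hypothetical; the per-minute cost and CAGR are the figures quoted above):

```python
# Back-of-the-envelope downtime cost, using the Ponemon averages above.
COST_PER_MINUTE = 9_000            # USD, Ponemon Institute average
outage_minutes = 90                # hypothetical outage duration

print(f"Cost per hour: ${COST_PER_MINUTE * 60:,}")              # $540,000
print(f"{outage_minutes}-minute outage: ${COST_PER_MINUTE * outage_minutes:,}")

# Market growth at the 12.4% CAGR quoted above:
print(f"5-year market multiple: {(1 + 0.124) ** 5:.2f}x")       # ~1.79x
```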

This is essential in today’s fast-moving world of big data, which is leaving old ICT systems further and further behind. With Australian enterprises putting an increasing focus on scalability, standards and security, legacy systems just don’t cut it. Add the fact that many companies lack the expert knowledge and services to keep up with swift IT advances, and you have a landscape full of laggards, not leaders.

As such, one of the most challenging tasks facing CIOs is keeping their IT strategy aligned with the business strategy, and their ageing data centres upgraded and running efficiently.

Improving overall data centre functionality and performance calls for modernising your existing data centre with the latest technologies, infrastructure and services. A data centre that is highly responsive, agile and sustainable reduces operational costs and risk, and is future-proofed for expansion.

Your data centre has the potential to drive your business forward and help you be a leader in this fast-paced world. Here are our top 7 must-knows when considering a data centre solution:

  1. Scalability

One of the major benefits a robust data centre solution can bring to your business is the ability to add data storage capacity as business requirements emerge. This means you pay only for the capacity you need, knowing you can easily scale to meet growing data volumes.

  2. Security

Maintaining the confidentiality, integrity and availability of information is critical to the success of your organisation. Place your information in a facility that offers state-of-the-art digital surveillance and security equipment to prevent unauthorised access. Consider measures such as biometric access points, 24-hour on-site security controls, integrated access management and CCTV systems.

  3. Reliability & High Availability

Reliability and high availability are not the same thing: a reliable system that takes a long time to fix when it does fail does not have high availability. Data centres that offer both are engineered to international standards of excellence, ensuring appropriate controls are in place.
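
The distinction is easy to quantify. Steady-state availability is commonly estimated as MTBF / (MTBF + MTTR), so repair time matters as much as failure rate; a rough illustration with hypothetical figures:

```python
# Availability depends on repair time as well as failure rate:
#   availability = MTBF / (MTBF + MTTR)
# A reliable system (high MTBF) that is slow to repair (high MTTR) still has
# poor availability. Figures below are hypothetical.
def availability(mtbf_hours: float, mttr_hours: float) -> float:
    return mtbf_hours / (mtbf_hours + mttr_hours)

print(f"Reliable, slow to fix: {availability(10_000, 72):.4%}")  # ~99.29%
print(f"Reliable, fast to fix: {availability(10_000, 1):.4%}")   # ~99.99%
```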

Keep these key national standards for data centre facilities in mind when looking at your options:

  • Information Security Management Systems (ISMS) ISO 27001
  • Government Security Standards
  • Environmental Management System ISO 14001
  • ITIL IT Service Management
  • ISO 9001 Quality System

  4. Energy Efficiency

Having reliable and high-performance computing power is a crucial aspect of running your business effectively and competitively. It is also highly energy intensive. In Australia, the NABERS Energy rating for data centres helps CIOs, IT managers and tenants reliably assess the energy performance of an outsourced data centre and compare it with other data centres within Australia.

This provides a way for data centre energy efficiency to be externally validated against a standard by an independent government assessor.
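
NABERS itself is an independently assessed star rating, but the underlying question, how much of the power a facility draws actually reaches the IT equipment, can be illustrated with the widely used PUE metric. A minimal sketch with hypothetical figures:

```python
# Power Usage Effectiveness (PUE): total facility power / IT equipment power.
# A lower PUE means less energy spent on cooling and overhead per unit of
# useful IT work. Figures are hypothetical; NABERS itself is a separate,
# independently assessed star rating.
total_facility_kw = 1_500   # IT load plus cooling, UPS losses, lighting, etc.
it_equipment_kw = 1_000

print(f"PUE: {total_facility_kw / it_equipment_kw:.2f}")  # 1.50
```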

  5. Services

Choose a flexible provider who can offer professional expertise, support and services as your business and circumstances change, including:

  • Co-location services
  • Add-on services – On-site services, such as remote hands and media management
  • Project services – Relocation, installation and consolidation from your premises or third party location into a managed data centre
  • Managed services – End-to-end delivery of services and support.

  6. Servers

Servers are the biggest energy users in a data centre, which is why you achieve greater efficiency by running servers that consume less power while still delivering best-in-class performance. Your data centre provider can run theirs, or you can supply your own for use in the facility, but it’s critical to opt for servers that can deliver at the speed your enterprise demands. Bookmark this reference to Fujitsu PRIMERGY servers as a good starting point.

  7. Location

If you’re considering a co-location data centre solution, the location of the facility (or facilities) you will use is an integral factor. If someone from your company will be upgrading or servicing your equipment, you need easy access to the site. Also investigate each location’s exposure to natural disasters and what redundancy arrangements are in place.

Keep pace with a world moving at digital speed and be the leader, not the laggard. Fast-track your data centre modernisation by understanding the key transformation factors.


Agile management changes the business

The 20th-century model of business management had anything of consequence decided at the top: big leaders appointed little leaders, competition for power constricted change, work was assigned from above and control was paramount.

The trouble with this 20th-century model is precisely that it is 20th-century: it struggles to compete with agile 21st-century businesses. The 21st-century business is a fast adapter, with customer centricity as its driver for change and its disrupter for growth.

The older model has the company at the centre, with customers in orbit to be manipulated. In contrast, the modern, agile business model has the customer at the heart with the company in orbit, looking for ways to delight the customer.

In the latter kind of business, everyone in the organisation understands how their work contributes to the focus on the customer.

In agile management:

  • Leaders see themselves and act as enablers rather than controllers
  • Work is coordinated through structured, customer-focused practices, not bureaucracy
  • Leaders model transparency and continuous improvement
  • Communication is open and conversational, rather than top-down and hierarchical.

Manufacturing is in the early stages of adopting agile management. However, this trend will accelerate as physical products become more software-driven and join the Internet of Things, with more and more devices and appliances becoming cyber-connected.

In other sectors, the debate is already over, with attention on improving agile methodologies, learning how to apply them across different teams, and reconciling team goals, practices and values with company goals, values and practices.

One of the United Kingdom’s leading hotel, restaurant and coffee shop operators, which boasts 45,000 employees, recently upgraded its IT systems to become more agile.

The new system gave it a flexible platform, enabling the company to be responsive and adaptable to its market and to other business demands. As a result, its business systems are more easily and widely available and suffer less downtime, and the company is more efficient and productive.

At the heart of this success is the efficient management of data – the traditional data centre just couldn’t cut it.

Next generation data centres are leading the way in enabling agile business management, and SolidFire, an all-flash array, is empowering enterprises to adapt to massive IT changes. As little as two years ago, spinning-disk storage vendors didn’t see an all-flash system like SolidFire as a threat because of its cost. But prices have fallen, and are still falling, and all-flash storage offers powerful advantages.

It keeps up with radical change because it can be expanded with no downtime and no reconfiguration, and it offers faster access to data.

Let’s compare a traditional system with SolidFire:

Traditional System          SolidFire
------------------          ---------
Single tenant               Multi-tenant
Isolated workloads          Mixed workloads
Dedicated infrastructure    Shared infrastructure
Scale-up                    Scale-out
Manual administration       Automation
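
The scale-up versus scale-out row is the easiest to put numbers on. A minimal sketch, using hypothetical per-node figures broadly in line with the capacities and speeds quoted later in this series:

```python
# Scale-out: add identical nodes and capacity AND performance grow together.
# Scale-up: a bigger controller adds capacity, but the performance ceiling
# stays fixed. Per-node figures below are hypothetical.
NODE_CAPACITY_TB = 20
NODE_IOPS = 200_000

for nodes in (4, 10, 25):
    capacity_tb = nodes * NODE_CAPACITY_TB
    iops = nodes * NODE_IOPS
    print(f"{nodes:>3} nodes: {capacity_tb:>4} TB capacity, {iops:>9,} IOPS")
```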

Next generation data storage has to be at the heart of making your business agile. Download our infographic to see how SolidFire can help you.

 

The new data storage model for radical IT innovation

You’ve undoubtedly heard about the Internet of Things (IoT), in which almost every device or appliance is connected to the Internet. The collective intelligence that can be gained from these devices will forever change the activities and lives of businesses and consumers everywhere.

It is probable that the IoT will affect not just connected devices, but also many unconnected devices that nevertheless have a current running through them.

We’re not going to get into the details of that in this article, but the point is simply this: everything you’ve heard about the Fourth Industrial Revolution – like the explosion in IoT, cloud computing and Big Data – is likely to prove an understatement.

Recent research from Gartner shows that almost 50 percent of businesses have invested in Big Data initiatives, but most are not getting the best out of their expenditure. The reason: insufficient planning or structure around how to find what they need in the data, and how best to use it.

Given the pace and scale of technological change, that’s understandable. But what can you do about it?

Some traditional companies are now getting advice on how to revamp their IT systems and workplaces so they can leverage the vast computing resources of the Cloud – so they have a chance to keep up with newer, faster-moving businesses.

Businesses struggling to find what they need from huge information flows may take some encouragement from Google. The company has announced the development of an artificial intelligence (AI) engine that it says represents the biggest shift in computing since the appearance of the smartphone.

The company says its massive search database, which now holds about 17 billion facts, will help its AI engine answer queries.

But even when that kind of technology filters down to more general use, it will still be a poor idea to let AI loose on a random assortment of information. A modern, versatile storage system will be needed to provide the dedicated, blistering performance required to get the most benefit from any kind of search in an acceptable time frame, whether AI or something else.

SolidFire is a storage system ideal for businesses wanting to ease the transition to the Fourth Industrial Revolution. It’s an all-flash array, meaning it has no spinning disks. This offers a number of major advantages:

  • Install only the capacity you need, and expand at any time with no downtime and no reconfiguration
  • Ensure rapid data access: performance scales from hundreds of thousands of input/output operations per second (IOPS) into the millions as capacity grows from terabytes to multiple petabytes
  • Mix node types and protocols within a cluster.

In addition, SolidFire provides complete automation, guaranteed quality of service, data assurance (including self-healing drive and node rebuilds) and global efficiency.

SolidFire provides a solid foundation for manipulating and processing the data now being generated in the digital age. And when that foundation is solid, you can make good decisions with confidence, and gain an edge on those who continue to operate in a new world using outdated technology.

Make sure your business is ready. See the infographic here.

The Big Data challenge has an answer in SolidFire

In 2014, about 35% of computer users accessed the cloud, according to the UK Office for National Statistics. That figure has since almost doubled, and within three years 86% of workloads are expected to be processed in cloud data centres.

The quantity of data is doubling every 18 months too, and forecasts suggest data production will be 44 times greater in 2020 than it was in 2009. Individuals today create 70% of all data; enterprises store 80% of it.

The World Economic Forum has identified six software and services megatrends, and the Fourth Industrial Revolution is now under way.

The growth in Big Data makes it challenging to find the information relevant to you and to make sense of it. This is part of the disruption that Big Data sits at the heart of.

However, these challenges also present opportunities. For example, data analytics is leading to insights into virtually any field you can think of.

Not long ago, a biotech company used data analytics to sort through gigabytes of data, isolating 23 optimal genes, and that led to the first gender-specific diagnostic tests for heart disease. Technologies such as SolidFire help make developments like these possible. SolidFire offers specific advantages for data storage requirements both now and in the future:

  • Agility: With SolidFire, enterprises can support specific solutions and adapt on the go to multiple workload environments, without affecting the performance of existing applications.
  • Reliability: A key requirement for next generation data centres is repeatable, predictable performance. With SolidFire, businesses can specify and guarantee minimum, maximum and burst IOPS (input/output operations per second) for individual storage volumes on the fly, independent of capacity (see the API sketch after this list).
  • Automation: SolidFire not only has application programming interfaces (APIs) for automating storage management, but exposes every storage function of the array through the API. Data availability is also highly automated.
  • Easy scalability: SolidFire’s patented Quality of Service (QoS) technology virtualises performance as a resource, allowing businesses to manage storage performance independently of storage capacity. Because of this, SolidFire can deliver predictable storage performance to thousands of applications within a shared infrastructure. The architecture also allows linear scale-out of capacity and performance as nodes are added, up to 3.4 petabytes of effective capacity and a potential 7.5 million guaranteed IOPS.
  • Redundancy: SolidFire’s shared-nothing architecture means no hardware component is shared across the system. Connectivity between nodes is redundant, so anything in the cluster can fail and the system will still run. If one or more nodes fail, SolidFire automatically rebuilds redundant data across the remaining nodes in minutes, restoring full redundancy while maintaining all guaranteed QoS settings.
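
The per-volume QoS controls described above are driven through the array’s API. The sketch below shows what setting min/max/burst IOPS on a volume might look like via SolidFire’s Element JSON-RPC API; the method and field names follow the published Element API documentation but should be verified against your array’s API version, and the cluster address, credentials and volume ID are hypothetical:

```python
import requests

# Hedged sketch: setting per-volume QoS (min/max/burst IOPS) through
# SolidFire's Element JSON-RPC API. Method and field names follow the
# published Element API docs but should be verified against your array's
# API version; cluster address, credentials and volume ID are hypothetical.
CLUSTER_API = "https://192.0.2.10/json-rpc/9.0"

payload = {
    "method": "ModifyVolume",
    "params": {
        "volumeID": 42,
        "qos": {
            "minIOPS": 1_000,     # performance floor guaranteed to the volume
            "maxIOPS": 15_000,    # sustained ceiling
            "burstIOPS": 20_000,  # short-term burst allowance
        },
    },
    "id": 1,
}

response = requests.post(CLUSTER_API, json=payload,
                         auth=("admin", "password"), verify=False)
response.raise_for_status()
print(response.json())
```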

Remember, without control and the ability to scale, performance is just a Band-Aid! See a summary of what SolidFire offers by viewing this infographic.

Future-storage solutions must be adaptable and flexible

Business managers used to look at data and ask, “How cheaply can we store this stuff?” Not now, though. Now the question is likely to be, “How fast can we get this processed and get the analysis back?”

There’s more data, and more pressure on making sense out of it all.

Consider that 90% of all data in existence was created in the past two years. Data is growing at 65% a year, which compounds to roughly 100% growth every 18 months. In other words, we are talking about exabytes and zettabytes of data, with yottabytes not far off, and we have pretty much gone past recognising any real boundaries on capacity.
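
Those two growth figures are consistent with each other, as a quick check shows:

```python
# Quick check: 65% annual growth compounds to roughly a doubling
# every 18 months, as stated above.
annual_growth = 1.65
print(f"Growth over 18 months: {annual_growth ** 1.5:.2f}x")   # ~2.12x
```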

The vast quantities of data generated by mobile devices, smart sensors, the Internet of Things, connected vehicles and more demand solutions that go beyond adding rows upon rows of last-generation hardware and software to crunch digits. Performance agility and flexibility are now crucial parts of platform architecture.

In 2009, all-flash storage was thought to be a niche installation that would stay that way: the economics of disk seemed too good to be threatened by another medium.

Seven years later, we see that all-flash arrays will replace Tier 1 and Tier 2 storage solutions, and possibly Tier 3. The future for disk is archival data.

Fujitsu are there now with SolidFire. Our all-flash array can be customised: in size, by scaling out from four nodes to 100; in capacity, from 20 terabytes to many petabytes; and in speed, from 200,000 IOPS (input/output operations per second) to millions.

With SolidFire and its Quality of Service (QoS) architecture, both SQL (Structured Query Language) and NoSQL (Not only SQL) database workloads run properly without unnecessary expenditure, because SolidFire allows you to configure the system to meet your specific capacity and performance demands.

At the same time, SolidFire is adaptable and flexible. For example, its Quality of Service controls make it simple to mix and match almost any workload types within the shared infrastructure while maintaining predictable performance to each application.

Administrators can choose to run many instances of one type of workload or run any combination of block storage workloads without compromising performance.

Scalable databases, including read-heavy and write-heavy instances, can go onto SolidFire and be protected with QoS.

Do you need to create dozens of distributed web application instances using a rapid cloning process, and then double the number of workloads quickly and without affecting the performance and availability of the running instances?

Not a problem.

Or you might want to stage a production database to a test/development environment, while it’s running, without slowing the performance of the workload.

Easy as.
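
As an illustration of that last scenario, here is a hedged sketch of cloning a live volume through SolidFire’s Element JSON-RPC API. The "CloneVolume" method follows the published Element API documentation, but verify it against your array’s API version; the cluster address, credentials, volume ID and name are hypothetical:

```python
import requests

# Hedged sketch: cloning a running production volume into a test/dev copy
# via SolidFire's Element JSON-RPC API. "CloneVolume" follows the published
# Element API docs but should be verified against your array's API version;
# the cluster address, credentials, volume ID and name are hypothetical.
CLUSTER_API = "https://192.0.2.10/json-rpc/9.0"

payload = {
    "method": "CloneVolume",
    "params": {
        "volumeID": 42,             # the live production database volume
        "name": "proddb-testdev",   # the space-efficient clone
    },
    "id": 1,
}

response = requests.post(CLUSTER_API, json=payload,
                         auth=("admin", "password"), verify=False)
response.raise_for_status()
print(response.json())
```

Because the clone carries its own QoS settings, heavy test/dev activity cannot steal IOPS from the production workload it was copied from.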

To learn more about how SolidFire is the all-rounder you need for your future storage needs, read the white paper.

Revolution causes disruption in information technology

Most people we know are busy with all the usual stuff of daily life, unaware that a huge revolution is going on.

The Fourth Industrial Revolution is right now.

The increasing fusion of the physical and digital worlds at the heart of this revolution can be seen in connected cars, smart homes, increasing types of intelligent devices, growing numbers of sensors, and so on.

This revolution is not a change, it’s a disruption.

In the midst of this disruption, data centres, although still collecting and processing more data than ever, are becoming defined not by hardware but by software. They must still have the right hardware, though: when it comes to computing power, the agile aggregation of performance is vital.

In agile aggregation, development is not linear with a single end result; it happens in advances, stage by stage, all the time. Managers and leaders also need accurate, timely information, and agile business intelligence gives them exactly that.

The upshot? Today’s data centres work best with all-modular, virtualised, industry-standard servers. Just as well, because new models of data storage are critical to riding out the disruption.

As all-flash arrays become the default option in IT storage, Fujitsu are launching a new era with our flash-first model, built on SolidFire and NetApp.

An all-flash array is a solid-state storage system: it uses multiple flash memory drives instead of spinning disks, has no moving parts, and can transfer data much faster than traditional electro-mechanical disk drives.

Software-defined systems meet the almost unimaginable demands of hyper-scale storage.

High density will be a feature of servers in hyper-scale setups, saving space and cutting costs. SolidFire is all-modular, so it can meet the needs of a small business, yet has hyper-scale functionality, so it can handle huge demands.

SolidFire also:

  • reduces power consumption
  • dissipates heat more effectively, and
  • offers extreme flexibility and dynamism in network connectivity.

SolidFire’s scale-out architecture, Quality of Service (QoS) capabilities and hardware compatibility guarantee give our customers what they need, including:

Scale out: From tens of terabytes to multiple petabytes. Non-disruptive, no downtime scaling.

Complete automation: Comprehensive application programming interfaces and cloud-based monitoring. Instant provisioning. Automatic data distribution and load balancing.

Guaranteed Quality of Service: Independent control of storage performance and capacity. Real-time performance management. Fine-grain QoS settings (faster transactions).

Data assurance: 256-bit encryption at rest. Self-healing drive and node rebuilds. Rapid, space-efficient snapshots and clones. Real-time replication.

Global efficiencies: Inline and post-process compression. Always-on de-duplication (eliminating duplicate data). No performance impact. Global thin provisioning (total user capacity allocated as virtual storage; physical disk capacity allocated only as needed).
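
To make the de-duplication idea concrete, here is a minimal, illustrative sketch of content-addressed block storage. This shows the general technique, not SolidFire’s actual implementation:

```python
import hashlib

# Illustrative-only sketch of block-level de-duplication: blocks are stored
# once, keyed by a hash of their content, and volumes keep only references.
# The general technique, not SolidFire's actual implementation.
BLOCK_SIZE = 4096
block_store: dict[str, bytes] = {}   # content hash -> unique block
volume: list[str] = []               # ordered block references for one volume

def write_block(data: bytes) -> None:
    digest = hashlib.sha256(data).hexdigest()
    if digest not in block_store:    # store the bytes only the first time
        block_store[digest] = data
    volume.append(digest)            # duplicates cost one reference, not 4 KB

write_block(b"A" * BLOCK_SIZE)
write_block(b"B" * BLOCK_SIZE)
write_block(b"A" * BLOCK_SIZE)       # duplicate: de-duplicated away

print(f"Logical blocks written: {len(volume)}")        # 3
print(f"Physical blocks stored: {len(block_store)}")   # 2
```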

To learn more about using SolidFire and being ready for the Fourth Industrial Revolution disruption, check out this white paper.

The Real-Time Revolution

We already know about Big Data and its ramifications for better business decision making, but what’s really becoming a business game changer is Fast Data.

Fast Data is the emphasis on processing information as it comes in, in real time, so enterprises can be aware and take action immediately.

Accelerating operations in real time

One renowned German automaker has a firm focus on producing vehicles capable of reaching impressive speeds, but that philosophy didn’t always translate over to its operations.

The engines constructed by the marque are central to its high-performance reputation, but in the past the way they were tested was holding back efficiency, productivity and further development.

The company realised that monitoring test data in real time could significantly speed up the process: real-time engine data could be instantly correlated with historical test data to recognise an issue as soon as it arises.

Now that engine data is available immediately, there is no need to wait for an hour-long test to finish before analysing it, and engineers can halt a test at any step in the procedure the minute it exhibits unusual behaviour. That has led to more engine-testing capacity each week, enabling engineers to focus on further refinement of the company’s high-performance engines.
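
The halt-on-anomaly pattern described here, comparing live readings against historical test data and stopping as soon as something looks wrong, can be sketched in a few lines. All figures and thresholds below are hypothetical; the automaker’s actual system is not public:

```python
import statistics

# Hedged sketch of the halt-on-anomaly pattern described above: compare each
# live reading against historical test data and stop the test the moment it
# deviates too far. All figures and thresholds here are hypothetical.
historical_rpm = [7200, 7150, 7230, 7180, 7210, 7190]   # past test runs
mean = statistics.mean(historical_rpm)
THRESHOLD = 3 * statistics.stdev(historical_rpm)        # tolerance band

def live_readings():
    yield from [7195, 7205, 7188, 6400]                 # 6400 is anomalous

for step, rpm in enumerate(live_readings(), start=1):
    if abs(rpm - mean) > THRESHOLD:
        print(f"Step {step}: {rpm} rpm deviates from history -- halting test")
        break
    print(f"Step {step}: {rpm} rpm within tolerance")
```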

Respond or relapse

This emphasis on monitoring and responding to business opportunities as they happen is quickly becoming the norm for high-performing organisations worldwide. A study of global companies by The Hackett Group found that 70 percent of leading firms had access to financial, customer and supplier information in near real time, or within one day.

Being able to respond in real time, however, requires systems capable of working at breakneck speeds, and that’s a capability you can take advantage of with NetApp® All Flash FAS. This flash storage solution delivers low latency and enables your systems to operate up to 20 times faster than disk storage. That can mean the difference between acting on the spot and winning business, or losing out to your competitors.

Fujitsu New Zealand is the premier NetApp System Services Certified Partner (SSCP) in NZ, providing professionally delivered, comprehensive support for critical NetApp® infrastructure. Our consulting expertise helps you take full advantage of all the product features and benefits relevant to your environment. With our assistance, your NetApp infrastructure not only runs smoothly, it’s optimised for performance.

Click here to download our ebook on how All Flash FAS can transform your enterprise.

Big Data: What on earth is it?

Pramod Singh, Principal Consultant – Big Data at Fujitsu, provides an insight into Big Data…

Big Data is one of the leading strategic technology trends of 2013. Hence, most leading information management vendors are building capabilities around it, and some leading organisations have started planning to add Big Data to their data warehouse and data integration infrastructure. Yet many IT leaders and information managers still ask: what on earth is it? Is Big Data a completely new concept, or an old concept with a sugar coating?

Human-centric ICT and real-time Insight

Craig Baty, our Chief Technology and Innovation Officer, provides an insight into some of the lesser-known benefits of Big Data strategies, and how Fujitsu is making them real for everyday people in the street…

Fujitsu’s overarching vision is based around the creation of a Human Centric Intelligent Society: the linkage of the physical world and the digital world. The physical world is where we live, an environment now saturated with mobile devices and pervasive networks. That domain is backed by a digital world that holds vast resources of information and analytical power.

These two worlds are now being synchronised and exploited to provide previously unimaginable potential by delivering data-driven insight at high speed. Some impacts of this are:

  • Computing ecosystems are producing new solutions that can be brought to bear on entire industries or across society.
  • Many different types of technology – mobile, network, cloud, sensors, social media and consumer electronics – are aligning into connected architectures to deliver richer and deeper content.
  • Decision-making time is being reduced or eliminated, and the resulting analytical responses provide an increasingly clear perspective for greater assurance in decisions.

Big Data and implications of digital awareness of real life

When people think of “Big Data”, they often think mostly about processing massive amounts of information with the aim of unearthing nuggets of useful insight. However, this is only part of the Big Data definition.

“As a mega-trend, its impact will be as big as that of the Internet, the PC, or virtually any breakthrough technology you could name”

Fujitsu’s view is that Big Data is generally unstructured, comes from multiple sources (often from the Cloud), is generated and analysed in real time, and should be used not only to describe a situation but also to enable predictions to be made, and then actions to be prescribed based on those predictions. We call this Real Time Insight, and it will have huge implications for us and how we live.
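
The describe → predict → prescribe progression can be sketched as a simple pipeline. This is a toy illustration of the concept, with invented names, data and thresholds, not a Fujitsu implementation:

```python
# Toy illustration of the describe -> predict -> prescribe progression
# behind Real Time Insight. All names, data and thresholds are invented;
# this shows the concept, not a Fujitsu implementation.
def describe(readings: list[float]) -> float:
    """Describe the present: summarise the latest sensor readings."""
    return sum(readings) / len(readings)

def predict(current: float, trend: float) -> float:
    """Predict the near future by extrapolating the observed trend."""
    return current + trend

def prescribe(predicted: float) -> str:
    """Prescribe an action based on the prediction, not just the present."""
    return "act now, before the peak" if predicted > 0.8 else "no action needed"

readings = [0.55, 0.62, 0.71]            # streaming sensor values (toy data)
print(prescribe(predict(describe(readings), trend=0.21)))
```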

As a mega-trend, its impact will be as big as that of the Internet, the PC, or virtually any breakthrough technology you could name. In the near future we might anticipate that:

  • Systems will “sense and respond” rather than merely process transactions, and the question “will humans or machines make the decision?” will arise with increasing frequency.
  • Our focus will have switched from reactive to proactive processes (medical treatment, for instance, will focus on maintaining wellbeing rather than on treating illness).
  • Speed of processing and decision making will be everything, and everything will be speeding up.

Big Data in action: Managing Tokyo’s traffic

Take, for example, the problem of managing Tokyo’s traffic. Tokyo is a huge metropolis with a very large population, where traffic jams and transport disruption are ubiquitous. Applying the concept of real-time insight, Fujitsu Japan has recently launched SpatioOwl, a cloud-based intelligent traffic management system. It collects masses of data from an incredibly rich variety of sources: from sensors planted in fleets of vehicles such as taxis and hauliers, from roadside sensors that monitor traffic flow, even down to subtle signals like the speed at which windscreen wipers are moving in the rain. It also collects data from individuals and communities, from social media and events.

The real value comes from what happens at the back end, in the digital world. All of this data is fed into a cloud platform, making it available for many different as-a-service uses. Fleet and logistics managers can use it to route their traffic in the most efficient way. Individuals can use it to get simple traffic reports. Urban authorities can use it to manage traffic control in real time. And as we move into the future, a major application will be linking drivers to supply points for electric vehicles. The potential is vast. Researchers at Fujitsu are using the system to map unsafe areas of the road network based on braking information, and another system smooths supply and demand for the city’s taxis, so that an individual need never wait for a taxi again. For a deeper insight, please see the analysis in Fujitsu’s Technology Perspectives.
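
That braking-data use is a good illustration of how raw sensor events become insight. A toy sketch, with data and threshold invented for illustration, not SpatioOwl’s actual logic:

```python
from collections import Counter

# Toy sketch of the braking-data idea above: count hard-braking events per
# road segment and flag hotspots. Data and threshold are invented for
# illustration; this is not SpatioOwl's actual logic.
events = [                        # (road segment, deceleration in m/s^2)
    ("segment-12", 7.8), ("segment-12", 8.4), ("segment-12", 9.1),
    ("segment-07", 3.2), ("segment-07", 8.9), ("segment-31", 2.1),
]
HARD_BRAKE = 7.0                  # deceleration treated as a hard stop

hard_brakes = Counter(seg for seg, decel in events if decel >= HARD_BRAKE)
hotspots = [seg for seg, n in hard_brakes.items() if n >= 3]
print(f"Hard-braking hotspots: {hotspots}")   # ['segment-12']
```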

This is just one example of how we see the Big Data trend playing out to benefit not only corporations and governments, but individuals in the street. For more examples of how Fujitsu is working towards the creation of a Human Centric Intelligent Society, please go to www.technology-perspectives.com and download a free copy of Technology Perspectives, developed by Fujitsu’s Global CTO Community.