The future of intelligent storage

Cat Mules

Don't let your company fall into the data swamp.

We’re constantly generating data – reading the news online, watching a YouTube video, wearing a smartwatch, catching an Uber or turning on GPS.

Sure, you’ve heard about kilobytes, megabytes, gigabytes, or even terabytes – the data units the average person might encounter when sending an email or storing files on a hard drive, for example.

Now data generation has reached even more epic proportions. American cloud operating system provider Domo reports that the amount of data generated every day is in the quintillions of bytes – 2.5 exabytes, or 2.5 quintillion bytes. That’s likely equivalent to all the words ever spoken by humans, or a video call lasting 237,823 years. Domo forecasts that the rate of data generation will continue to accelerate.

So, it’s clearer than ever that data will be the foundation of the business platforms of the future.

Yet despite the masses of data being created, ideas about what to do with it all are only as good as the technologies we use to process and understand it.

We’re collectively moving away from the traditional model in which organisations treat data as an isolated cost centre – where every data project launched carries an “expense” attached to it. Companies that stick with that model risk slipping into a ‘data swamp’: a store of data whose insights are invisible or unusable, with no effective way to organise or manage the data life cycle.

When you’ve got tons of data at hand but don’t know what to do with it, you’re in what’s known as a “data swamp”. Credit: Network World.

Choosing your data approach – key considerations

If you could invest in strategies that allow data to be used to run your company more efficiently, understand your customers better, and make better decisions, would you do it?

Today more than ever, professionals are being asked to argue their case and make decisions based on data. In the Umbrellar network, cloud migrations range from partial migrations to full data lift-and-shifts. Sunny Lakhiyan, Umbrellar Enterprise Services Manager and former Origin IT Technical Security Architect, says there are some key observations about what’s changing:

1 – Migration strategy and security. 

The abstraction of data into its own layer – with applications, rather than the traditional physical and platform layers, taking control of the data – has highlighted how outdated and ineffective traditional security methods are in the world of cloud applications. Newer, better controls are essential to keeping data secure.

2 – Cost. 

The decentralisation of storage across cloud platforms means multi-cloud and multi-platform implementations are becoming much more common. This presents a new challenge: ensuring data is protected regardless of its location and state. With 90 percent of companies using some kind of cloud service, cost governance – reliable service managers who track costs and overall resource consumption – is now essential.

3 – Performance. 

Performance requirements have evolved in recent times, enabling data analytics to become a key part of the overall data strategy. Where the focus used to be on meeting availability and performance SLAs, organisations now also require insights from their data platforms – tracked through trends and patterns – to enable better business decision-making.

Data change is company-wide: LinkedIn’s business case for data

The data needs of an organisation will always depend on its business objectives and values.

Futurist Bernard Marr’s book, Big Data in Practice, examines LinkedIn’s data strategy. He discusses the specific metrics collected, the insights shared with users, and changes to organisational culture.

Big data is key to how the largest professional network in the world works. LinkedIn tracks every click, page view and interaction of its – at a recent count – 414 million members. Marr claims this is necessary, “to ensure their site remains an essential tool for busy professionals, helping them become more productive and successful”.

LinkedIn also uses machine learning techniques to drive better suggestions for its users, and ultimately create the networks that work best for them, such as the “people you may know” feature.

Marr offers a scenario to illustrate this point: “Say LinkedIn regularly gave you suggestions for people you may know who work at Company A (which you worked at eight years ago) and Company B (which you worked at two years ago). If you almost never click on the profiles of people from Company A but regularly check out the suggestions from Company B, LinkedIn will prioritize Company B in their suggestions going forward.”
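Marr’s scenario amounts to ranking companies by the click-through rate their suggestions earn. The sketch below is a hypothetical illustration of that prioritisation idea – the data model and function names are invented, not LinkedIn’s actual implementation:

```python
from collections import defaultdict

def rank_companies(click_log):
    """Rank source companies by click-through rate on their suggestions.

    click_log: list of (company, clicked) pairs, where clicked is True
    if the user opened the suggested profile. Hypothetical data model,
    used only to illustrate the prioritisation idea.
    """
    shown = defaultdict(int)
    clicked = defaultdict(int)
    for company, was_clicked in click_log:
        shown[company] += 1
        if was_clicked:
            clicked[company] += 1
    # Higher click-through rate -> higher priority in future suggestions.
    return sorted(shown, key=lambda c: clicked[c] / shown[c], reverse=True)

# Company A: 1 click in 10 suggestions; Company B: 6 clicks in 10.
log = [("Company A", False)] * 9 + [("Company A", True)] \
    + [("Company B", True)] * 6 + [("Company B", False)] * 4
print(rank_companies(log))  # → ['Company B', 'Company A']
```

Real systems would add smoothing for companies with few impressions and decay older clicks, but the core signal is the same: observed engagement reorders future suggestions.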

LinkedIn also uses real-time stream processing to keep news feeds up to date – surfacing who has started a new job, or the useful articles that contacts have posted, liked and shared.

Culture changes with data, too. Marr explains that LinkedIn has built an impressive team of data scientists and created a culture where data insights are deeply integrated and learning is encouraged – and where the data science team’s findings feed directly to the company’s CFO and engineering department.

These days there’s an algorithm for just about anything. AI can be used to clean and augment incoming data, and to correlate different sources of information to detect complex fraud.
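As a toy illustration of the “correlate sources” idea: join records from two systems and flag the disagreements as candidate fraud signals. The schema, names and rule here are invented for illustration – real fraud detection pipelines are far more sophisticated:

```python
def flag_inconsistencies(billing, shipping):
    """Correlate two record sources keyed by order id and flag
    disagreements as potential fraud signals.

    billing, shipping: dicts mapping order_id -> country code.
    Hypothetical schema and rule, for illustration only.
    """
    flags = []
    for order_id, bill_country in billing.items():
        ship_country = shipping.get(order_id)
        if ship_country is None:
            # A billed order with no shipping record is suspicious.
            flags.append((order_id, "missing shipping record"))
        elif ship_country != bill_country:
            # Cross-source disagreement is a classic fraud signal.
            flags.append((order_id, "country mismatch"))
    return flags

billing = {"o1": "NZ", "o2": "NZ", "o3": "AU"}
shipping = {"o1": "NZ", "o3": "US"}
print(flag_inconsistencies(billing, shipping))
# → [('o2', 'missing shipping record'), ('o3', 'country mismatch')]
```

The point is the pattern, not the rule: once data sources are joined in one place, even simple cross-checks surface anomalies no single source reveals.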

With the masses of data we could have at our fingertips, getting a clear gauge on where your company is now and where it wants to be could be a powerful first step.

Cat Mules

Umbrellar's Digital Journalist, with a background in tech reporting and research. Cat is inspired by the epic potential of tech and by helping Kiwi innovators share their success stories.

