
Understanding and utilising big data can enable organisations to identify actionable insights in real time and help employees build meaningful solutions to business-critical issues and opportunities.
The explosion of activity in the big data analytics sector is both undeniable and understandable.
Done well, big data can help companies surface insights that translate into real business value.
The Royal Bank of Scotland, for example, has used big data analytics to underpin a strategy it calls “personology.”
This strategy is about delivering a more personable and personalised customer experience, using data to better understand, anticipate and service customer needs and queries.
This data-driven approach towards user experience has boosted customer loyalty and helped to improve activity and visibility across loans and insurance.
Supporting Big Data Success
However, simply investing money in technology is not enough to deliver big data success. Many big data projects have failed, usually for the same few reasons.
In their rush to jump on the big data bandwagon, many companies ignore the three key pillars that support successful big data analytics in business:
Desirability: defining the problem that the business is trying to solve.
Feasibility: identifying the tools, skills and resources needed to address the problem.
Viability: quantifying the business value of solving the problem for the enterprise.
For many organisations, ensuring these pillars are not overlooked requires a clear methodology and technical approach.
A design thinking methodology, combined with open source tools and techniques such as AI and automation, can enable organisations to mine a range of big data assets and identify the valuable, actionable data within both structured and unstructured data sets.
At its core, big data requires at least three things to work: gathering data from disparate sources, storing it, and gaining insights from it.
The problems of gathering and storing data have largely been solved (though both still present challenges).
Extracting useful signals from the noise in near real time, however, remains difficult.
Finding useful information in data stores is difficult for many reasons, but the biggest problem is more fundamental and more common than most organisations realise: companies have not scoped out at the start what they intend to do with the data they are gathering.
A design thinking methodology can help in this regard.
The map – design thinking
Before starting on a big data journey, we need to know the minimum viable problem that needs to be addressed.
For this, design thinking provides a critical method for getting to the root of a known problem, identifying an as yet unrecognised problem, or both.
It does this while staying as close to business reality as possible.
Essentially, it ensures that businesses focus on the right problems while developing viable solutions.
Discovering and defining problems is the most important part of any big data journey.
The reason is simple: design thinking helps to focus businesses on addressing customer pain points by looking for issues that aren’t always obvious.
Once the pain points that need addressing are identified, a company can define what data it needs to gather.
By focusing data on a particular issue, we can discover new actionable insights that we can execute on.
Design thinking requires a large culture shift for many enterprises, but offers significant benefits in return. It encourages teams to address challenges as a whole, rather than fixing individual problems in isolation.
The tools – data lakes and open source
Data lakes maintain data from various sources (internal, external and public) at the lowest level of granularity and in a cost-effective manner, spanning structured, semi-structured and unstructured formats, until it is needed.
Anyone with data manipulation skills can access and use data within a lake as they need it, regardless of origin.
All data sources are accessed equally within a lake structure.
The structural flexibility of data lakes makes the incremental cost of adding new use cases marginal.
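As a minimal sketch of how this might look in practice, the snippet below uses PySpark (an open source engine commonly paired with data lakes) to land records from two hypothetical sources into object storage at full granularity, with no aggregation or shared schema imposed up front. The bucket paths, source names and column choices are illustrative assumptions, not a reference to any specific platform.

```python
from pyspark.sql import SparkSession
from pyspark.sql import functions as F

spark = SparkSession.builder.appName("lake-ingest").getOrCreate()

# Structured source: CSV export from an internal system (hypothetical path).
orders = spark.read.option("header", True).csv("s3a://example-lake/landing/orders/")

# Semi-structured source: JSON clickstream events from a web front end (hypothetical path).
events = spark.read.json("s3a://example-lake/landing/events/")

# Stamp each record with an ingest date and write it to the lake as-is,
# keeping the lowest granularity and deferring schema decisions to read time.
for name, df in [("orders", orders), ("events", events)]:
    (df.withColumn("ingest_date", F.current_date())
       .write.mode("append")
       .partitionBy("ingest_date")
       .parquet(f"s3a://example-lake/raw/{name}/"))
```

Because nothing is filtered or summarised on the way in, a new use case later only needs a new read of the same raw files, which is what keeps the incremental cost of additional use cases low.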
However, data lakes are not a panacea for big data issues. Additional tools are needed to channel the data they hold into something useful. That's where open source tools come in.
Open source big data tools offer a variety of capabilities, ranging from insights and forecasting to predictive analytics and more, at an affordable cost.
These tools can also help to codify and automate operations, lessening the need for human intervention in the most menial processing. This frees organisations to identify better insights and make better use of the time invested in data analysis.
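To make the kind of capability meant here concrete, the sketch below uses two open source libraries, pandas and scikit-learn, to fit a simple forecasting model on a synthetic daily sales series. The data, features and model choice are illustrative assumptions; a real project would train on history drawn from the lake and add richer features.

```python
import numpy as np
import pandas as pd
from sklearn.ensemble import GradientBoostingRegressor

# Synthetic daily sales history standing in for data read from the lake.
rng = np.random.default_rng(seed=42)
days = pd.date_range("2023-01-01", periods=365, freq="D")
trend = 1000 + 2 * np.arange(365)
weekly = 150 * np.sin(np.arange(365) * 2 * np.pi / 7)
history = pd.DataFrame({"date": days, "sales": trend + weekly + rng.normal(0, 50, 365)})

# Simple calendar features; promotions, pricing and stock levels would be added in practice.
history["day_index"] = np.arange(len(history))
history["day_of_week"] = history["date"].dt.dayofweek

model = GradientBoostingRegressor()
model.fit(history[["day_index", "day_of_week"]], history["sales"])

# Forecast the next 14 days.
future = pd.DataFrame({"date": pd.date_range(days[-1] + pd.Timedelta(days=1), periods=14, freq="D")})
future["day_index"] = np.arange(len(history), len(history) + 14)
future["day_of_week"] = future["date"].dt.dayofweek
future["forecast"] = model.predict(future[["day_index", "day_of_week"]])
print(future[["date", "forecast"]])
```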
Solving even small problems can have a substantial bottom-line impact for a business. In one case, a 1% efficiency improvement delivered a $200 million return on investment for a large business.
The muscle – automation
Automation of data management can be transformational in the enterprise. Eliminating low-level manual processes frees up people, amplifying human potential to deliver more value and creativity further up the value chain.
Automation helps companies analyse data in real time, spotting anomalies with greater accuracy and at a faster rate. Combined with machine learning, automation can grow and adapt, refining itself to deliver further efficiencies and insights.
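A minimal sketch of this idea, using only the Python standard library, is a rolling statistical check over a live feed: each new value is compared with the recent window, and values far outside the expected range are flagged without a human in the loop. The threshold, window size and example feed are assumptions for illustration; a production system would typically consume a message queue and use richer models.

```python
from collections import deque
import statistics

def detect_anomalies(stream, window=50, threshold=3.0, min_history=10):
    """Yield (value, is_anomaly) pairs, flagging values more than
    `threshold` standard deviations from the rolling mean."""
    recent = deque(maxlen=window)
    for value in stream:
        if len(recent) >= min_history:
            mean = statistics.fmean(recent)
            spread = statistics.pstdev(recent) or 1e-9  # avoid division by zero
            yield value, abs(value - mean) / spread > threshold
        else:
            yield value, False  # not enough history to judge yet
        recent.append(value)

# Hypothetical feed of transaction amounts; one value is clearly out of pattern.
feed = [100, 102, 98, 101, 99, 103, 97, 100, 102, 99, 101, 950, 100, 98]
for amount, flagged in detect_anomalies(feed):
    if flagged:
        print(f"Anomaly detected: {amount}")
```

A machine learning layer would replace the fixed threshold with a model that is periodically retrained on labelled outcomes, which is how such a system refines itself over time.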
However, automation doesn’t happen by itself.
Data sources need to drive automation. That invariably means breaking data out of existing silos so it can be fed into an automation engine, providing the engine with a complete view of the entire relevant data pile, rather than just a segment or snapshot.
Employees, too, will need access to data in the round so they can recognise contextual bias in data capture, among other things that require a broader view.
Living Up to the Big Data Dreams
The effort is worth it.
Automation, data lakes, open source tools and design thinking all play critical parts in any big data journey.
Each component represents a fundamental part of the process of extracting maximum value from data assets.
They enable organisations to quickly identify actionable insights in real time and enable employees to build meaningful solutions to business-critical issues and opportunities.
And that’s the big data dream: finding those hard-to-spot problems, solving them and automating them to free up time for bigger things.
This article was originally published on www.information-age.com, where it can be viewed in full.

