If your organization is having trouble getting value from its data, you aren't alone, and big data firm Syncsort thinks it knows what's wrong.
While more organizations are prioritizing big data analytics in IT initiatives, many are facing a series of problems connected to collecting and making use of this information, according to a Tuesday report from Syncsort.
Big data and its effectiveness are only one element of the overall findings relating to IT initiatives in general, which also include a list of the most common business initiatives for 2019. Leading the pack for what companies plan to do this year are cloud/hybrid computing, modernizing infrastructure, data governance, and advanced/predictive analytics.
Interestingly enough, while 68% of respondents said their data analytics efforts are hampered by data siloing, only 25% say that data analytics is at the top of their priority list for 2019.
Why isn’t big data more effective?
One of the conclusions reached by the report was that “professionals are grappling with delivering data to business users,” and it cites data siloing as a major reason for that difficulty.
Siloed data refers to any information that is cut off from the rest of an organization, making it difficult or impossible to include when analyzing data at the enterprise level.
SEE: Big data policy (Tech Pro Research)
Despite problems with accessing data, only 38% of respondents say they plan to prioritize improving access to data for decision making in 2019. Instead, most businesses are planning IT initiatives that focus on increasing efficiency, improving customer experience, and reducing cost, all of which could directly affect how much value they get from big data, for better or for worse.
With that in mind, the responses given to the question of why enterprises are ineffective at getting big data insights are a bit disheartening: 53% say their team lacks the IT skills or staff to work on extracting meaningful insights from data, 50% say they lack the tools to feed downstream apps with the right data at the right time, 44% simply lack the time to sort through data, and 31% say their organization hasn’t invested enough in analytics platforms.
It's possible that business and IT goals aren't the entire picture, especially considering the lack of data lake adoption. (For those unfamiliar with data lakes, think of them as the opposite of siloed data: instead of keeping information cordoned off in silos, all of an organization's raw data is poured into one massive repository and sorted through later.)
Only 9% of respondents say they have been using data lakes for five or more years, while 24% have less than two years of use, and 23% are still evaluating whether to use a data lake at all.
A shallow history of non-siloed enterprise data can also hamper results: even the most thorough analysis is limited when the consolidated data only goes back a year or two.
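The silo-versus-lake distinction can be sketched in a few lines of Python. Everything here is hypothetical: the record fields, source names, and helper functions are illustrative, not drawn from the report. The point is that once raw records from separate systems land in one repository, a single query can see across all of them.

```python
from collections import defaultdict

# Hypothetical records held by three separate (siloed) systems.
sales_silo = [{"customer": "acme", "revenue": 1200}]
support_silo = [{"customer": "acme", "tickets": 3}]
marketing_silo = [{"customer": "acme", "campaign_clicks": 45}]

def build_lake(*silos):
    """Pour raw records from every silo into one repository,
    tagging each record with its source so it can be sorted through later."""
    lake = []
    for name, records in silos:
        for record in records:
            lake.append({"source": name, **record})
    return lake

def customer_view(lake, customer):
    """Assemble a cross-silo view of one customer from the combined lake."""
    view = defaultdict(dict)
    for record in lake:
        if record.get("customer") == customer:
            view[customer].update({k: v for k, v in record.items()
                                   if k not in ("source", "customer")})
    return dict(view)

lake = build_lake(("sales", sales_silo),
                  ("support", support_silo),
                  ("marketing", marketing_silo))
# One lookup now surfaces revenue, support tickets, and marketing clicks together,
# which no single silo could answer on its own.
print(customer_view(lake, "acme"))
```

In a real deployment the "lake" would be object storage holding raw files rather than an in-memory list, but the trade-off is the same: consolidation happens up front, and structure is imposed at query time.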
What can enterprises do to improve the usefulness of their data?
Enterprises that want to improve their ability to analyze the massive amount of data being produced in the modern world need to turn to the cloud or hybrid cloud, Syncsort CTO Tendü Yoğurtçu said in a press release.
“With the gravity of data shifting, organizations are trying to take advantage of the cloud’s elasticity and gain the ability to analyze and deliver trusted data into application pipelines as quickly as possible,” Yoğurtçu said in the release. “These are the precursors to improving data accessibility and taking advantage of the emerging technologies, like machine learning and streaming analytics, that will help deliver more value out of data.”
As mentioned above, this is only one of the major takeaways from Syncsort’s report. You can find it in its entirety on Syncsort’s website.