
Top 5 BI Trends 2019: Analytics to become more pervasive


Technology is changing faster than most of us can comprehend. And the changes are impacting us at every level – not only in our personal lives and workplaces but in our communities and the world. In fact, technology is changing the very ways society operates and shaping our future on the planet.

Technology is also creating shifts in power. Hundreds of years ago, power resided in land ownership. With the Industrial Revolution, power shifted to manufacturers. In the data age, power is moving to the organizations that hold the information.

Leading thinkers like Matt Turck and Yuval Noah Harari are raising concerns, pointing out that information is being centralized and consolidated into fewer and fewer hands. Behemoths like Google, Amazon, Apple, Alibaba, and Facebook are feeding off their hyperscale data centers, participating in the data and AI race, and upending industry after industry.

Now that information is power, it's incumbent upon all of us to establish a level playing field that decentralizes data ownership, empowers the masses, and helps ensure that data is used as a force for democracy, collaboration, innovation, equality, and progress.

Top 5 BI Trends 2019

1) The Multi-Cloud, Hybrid and Edge Continuum

Trend: In 2019, platforms will emerge that can handle multi-cloud, hybrid and edge as a continuum, rather than separate efforts.

The shift to cloud is happening. IT leaders are increasingly migrating not just their born-in-the-cloud data, but also the mission-critical data that runs their business. The promise of on-demand capacity, low-cost storage, and a rich ecosystem of tools is compelling. However, migration should be done with care. Too much centralization of data in one place risks lock-in, with associated back-end cost implications. It can also remove the flexibility to respond to policy and regulation, such as GDPR.

Beyond that, managing data in the cloud has its own set of rules, and if not done right the cost, complexity, and risk can bring down the house. The shift away from on-premises and legacy data centers should therefore happen at a pace organizations are comfortable with. The ability to centrally manage data and distribute it across multiple clouds, as well as across a hybrid on-premises and cloud continuum, is a good way to hedge bets. In addition, edge computing delivers the decentralized complement to today's hyperscale cloud and legacy data centers, and is often preferred for latency, privacy, and security reasons. That should also be brought into the fold: a post-modern platform should be able to handle distributed data, workloads, and usage across multi-cloud, hybrid, and edge as a continuum.
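As a rough illustration of that continuum, the sketch below routes workloads to an edge, on-premises, or public cloud tier based on latency budgets and data-residency constraints. The Workload fields, tier names, and thresholds are all hypothetical, not drawn from any particular platform.

```python
from dataclasses import dataclass

# Hypothetical workload descriptor; the field names are illustrative only.
@dataclass
class Workload:
    name: str
    max_latency_ms: int   # end-to-end latency budget
    data_residency: str   # e.g. "eu" for GDPR-scoped data

def place(workload: Workload) -> str:
    """Pick a tier on the multi-cloud/hybrid/edge continuum.

    Latency-critical work stays at the edge, regulated data stays
    on-premises in-region, and everything else can go to whichever
    public cloud is cheapest at the moment (avoiding lock-in).
    """
    if workload.max_latency_ms < 20:
        return "edge"            # e.g. a factory-floor gateway
    if workload.data_residency == "eu":
        return "on-prem-eu"      # keeps GDPR-scoped data in-region
    return "public-cloud"        # portable, cost-driven choice

if __name__ == "__main__":
    sensors = Workload("vibration-scoring", max_latency_ms=5, data_residency="any")
    finance = Workload("quarterly-close", max_latency_ms=5000, data_residency="eu")
    for w in (sensors, finance):
        print(f"{w.name} -> {place(w)}")
```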

2) The “Single View” of All Data is Finally Here

Trend: In 2019, focus will shift from putting data in one place, to attaining one view of the data.

The ability to have a single view of all data has never been more important than now. Data is arriving from different directions, at different speeds, and in different formats, and being able to control that will be one of the key markers of empowerment and success in the data age.

The problem is that, historically, it has taken a lot of effort to make that happen. Cumbersome efforts to put all the data in one place (such as the all-encompassing data warehouse or lake) didn't reach the goal, and that's happening again now in the cloud. It's a seemingly impossible feat because there will always be new data coming in, and being able to combine and analyze data at the source is what enables the agility a fast-moving world demands. Historically, these efforts have instead created data silos and governance problems.

Two massive trends are set to shift the situation, making it possible to get a single view of all data while keeping it where it resides. First, vendors are coming together and standardizing data models, which means cloud-based data sources in particular will have more consistent formats. Second, and more importantly, is the emergence of enterprise data catalogues. Accessible in a hub, data catalogues make it possible to audit the entire distributed data estate, delivering a shop-for-data marketplace experience, as sketched below. The more users share, collaborate, and use the hub, the more valuable it becomes to the business. Furthermore, it links the analytics strategy with the enterprise data management strategy.
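A minimal sketch of that shop-for-data idea, assuming a purely hypothetical catalogue hub: datasets stay where they live, and the hub holds only metadata, discovery tags, and usage counts.

```python
# Datasets are never moved; the catalogue is metadata plus pointers.
catalogue = {}  # dataset name -> metadata record

def register(name, location, owner, tags):
    """Publish a dataset's metadata without moving the data itself."""
    catalogue[name] = {"location": location, "owner": owner,
                       "tags": set(tags), "uses": 0}

def search(tag):
    """Let analysts discover datasets across the distributed estate."""
    return [n for n, meta in catalogue.items() if tag in meta["tags"]]

def check_out(name):
    """Record usage; heavily used datasets signal business value."""
    catalogue[name]["uses"] += 1
    return catalogue[name]["location"]

register("orders_2018", "s3://sales-bucket/orders/", "sales-ops",
         ["sales", "orders"])
register("churn_scores", "onprem://warehouse/churn", "data-science",
         ["customers", "churn"])

print(search("sales"))           # ['orders_2018']
print(check_out("orders_2018"))  # 's3://sales-bucket/orders/'
```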

3) Analytics Everywhere Reshapes Processes

Trend: In 2019, analytics will be more pervasive, and even re-shape business processes.

Embedding analytics into business processes isn't new, but it's now becoming mainstream. Users want analytics in their workflows because it makes data more actionable, increasingly in real time. All of this is fueled by machine learning and AI, which can provide contextualized insights and suggested actions. It's the foundation of “continuous analytics,” in which real-time analytics is gradually integrated within a business operation or device, processing data to prescribe actions in response to business moments and at the edge. In the next five years, “intelligent” applications will be ubiquitous.
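To make the idea concrete, here is a toy sketch of continuous analytics: a model is scored inline as a business event arrives, and its output prescribes an action rather than landing on a dashboard. The event fields, scoring rule, and threshold are invented for the example.

```python
def churn_score(event):
    """Stand-in for a deployed ML model scoring one business moment."""
    score = 0.0
    if event["support_tickets"] > 3:
        score += 0.5
    if event["days_since_login"] > 30:
        score += 0.4
    return min(score, 1.0)

def on_customer_event(event):
    """Embedded in the workflow: data -> insight -> prescribed action."""
    score = churn_score(event)
    if score >= 0.7:
        return {"action": "offer_retention_discount", "score": score}
    return {"action": "none", "score": score}

print(on_customer_event(
    {"customer": 42, "support_tickets": 5, "days_since_login": 45}))
# {'action': 'offer_retention_discount', 'score': 0.9}
```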

Furthermore, analytics is starting to re-shape processes themselves. New technologies like robotic process automation and process mining examine digital footprints and, from those, can further automate or re-shape business processes. For example, when a customer places an order online, these technologies can automate and re-shape sub-processes such as receiving, fulfilling, and invoicing that order.

4) The External Ecosystem Accelerates Innovation

Trend: In 2019, external innovation will outpace internal by 2X.

The number of people who can innovate around a technology inside a company is capped. But with a strong ecosystem, the number of people who can innovate from the outside is unlimited. Internal innovation has the benefit of tight integration, but those who sit closer to the business problem can be far more effective at providing contextualized business value and driving differentiation in the way they apply analytics. That was not possible with a closed, generic BI tool.

This is why open platforms with ecosystems, able to connect with partners and customers, will gradually supersede closed ones. In 2019, the market will conclude that open APIs and extensions are a necessity, as innovation from open platforms with ecosystems will outpace internal-only innovation by a factor of 2x. It's even more powerful with an extension pipeline from external to internal, in which extensions move from unsupported to certified and eventually become supported “out of the box.” That's the innovation of the ecosystem.
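One way to picture that pipeline: the platform exposes a registration hook, and anyone outside the vendor can plug in a visualization or connector. The registry below is a hypothetical sketch, not any specific BI product's API.

```python
# Hypothetical extension registry for an open analytics platform.
EXTENSIONS = {}

def register_extension(name, stage="unsupported"):
    """Decorator registering a third-party extension with the platform.

    `stage` models the pipeline from external to internal:
    unsupported -> certified -> out-of-the-box.
    """
    def wrap(fn):
        EXTENSIONS[name] = {"render": fn, "stage": stage}
        return fn
    return wrap

@register_extension("sankey-chart", stage="certified")
def render_sankey(data):
    # A real extension would draw a chart; we just describe the input.
    return f"<sankey rows={len(data)}>"

# The platform can now discover and run community-built extensions.
for name, ext in EXTENSIONS.items():
    print(name, ext["stage"], ext["render"]([1, 2, 3]))
```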

5) Performance Takes Center-Stage Yet Again, as Analytics Scale

Trend: In 2019, performance and scalability will take center-stage in enterprise selection criteria. 

Performance is undervalued when it comes to tool selection, and too often an afterthought. Analytic workloads run where query performance is good and latency is low. If a query takes longer than a few seconds, users lose interest and stop interacting with the data. If it takes more than a few hundred milliseconds, users may not be able to leverage it in a business process or an augmented-reality experience. When the self-service trend was in its nascence, performance was perhaps overlooked by many because building visualizations on a flat file doesn't take that much horsepower.

But many self-service BI solutions (often referred to as “modern BI”) that seem so cheap up-front fail when it comes time to scale to more data, workloads, and people across the enterprise. Performance has also been a bottleneck for distributed big data at scale, and the reason why many Hadoop projects never became much more than cheap storage. Breakthroughs have recently been achieved through indexing, caching, and pre-preparing very large and distributed datasets.
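A toy illustration of why caching and pre-preparation matter: the first query pays the full cost of a scan, while repeat queries return in milliseconds. The data and timings are synthetic, and functools.lru_cache stands in for a real engine's result cache.

```python
import time
from functools import lru_cache

# Synthetic fact table: two million (month, amount) rows.
ROWS = [(i % 12, i % 100) for i in range(2_000_000)]

@lru_cache(maxsize=None)
def sales_by_month(month):
    """Full scan on the first call; served from cache afterwards."""
    return sum(amount for m, amount in ROWS if m == month)

for label in ("cold", "warm"):
    start = time.perf_counter()
    total = sales_by_month(6)
    print(f"{label}: total={total}, "
          f"{(time.perf_counter() - start) * 1000:.1f} ms")
```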

Now, as companies of all sizes increase their adoption of hyperscale data centers, performance will rise up the list of selection criteria. Some organizations have even moved data back on-premises through “repatriation” because performance wasn't strong enough. This becomes even more important in IoT applications, where more and more workloads will run locally or at the edge to avoid latency. In short, efficient performance will be a deciding factor in whether architectures end up centralized or distributed.

Dan Sommer
Dan Sommer is the global lead for Qlik’s Market Intelligence program. Earlier, he worked at Gartner for 10 years as an analyst.