
Top 10 Technology Trends for 2019: AI, Autonomous Things, Digital Twins, Digital Ethics to dominate


As we move towards the new year, stories and analyses of the top 10 technology trends for 2019 will likely be the buzzword until mid-January. Taking the lead a week ago, the technology research firm Gartner presented its version of the top 10 "strategic" technology trends for 2019. A close look at Gartner's predictions over the last two years shows that largely similar technologies, dressed in new jargon, have made it onto each new list. Here are the top 10 technology trends that Gartner says will dominate 2019.

Autonomous Things

Autonomous things, such as robots, drones and autonomous vehicles, use AI to automate functions previously performed by humans. Their automation goes beyond that provided by rigid programming models: they exploit AI to deliver advanced behaviors that interact more naturally with their surroundings and with people.

“As autonomous things proliferate, we expect a shift from stand-alone intelligent things to a swarm of collaborative intelligent things, with multiple devices working together, either independently of people or with human input,” said David Cearley, vice president and Gartner Fellow. “For example, if a drone examined a large field and found that it was ready for harvesting, it could dispatch an ‘autonomous harvester’. Or in the delivery market, the most effective solution may be to use an autonomous vehicle to move packages to the target area. Robots and drones on board the vehicle could then ensure final delivery of the package.”
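To make the hand-off Cearley describes concrete, here is a minimal Python sketch of one intelligent thing triggering another. Every class and method name below is a hypothetical illustration, not a real robotics API.

```python
# Toy sketch of the drone-to-harvester hand-off described above.
# All names are hypothetical illustrations, not a real robotics API.
from dataclasses import dataclass

@dataclass
class FieldReport:
    field_id: str
    ready_for_harvest: bool

class SurveyDrone:
    def survey(self, field_id: str) -> FieldReport:
        # A real drone would run a vision model over imagery;
        # we hard-code a "ready" result for illustration.
        return FieldReport(field_id=field_id, ready_for_harvest=True)

class AutonomousHarvester:
    def dispatch_to(self, field_id: str) -> None:
        print(f"Harvester dispatched to {field_id}")

def coordinate(drone: SurveyDrone, harvester: AutonomousHarvester, field_id: str) -> None:
    # One device's observation triggers another device's action:
    # the "swarm of collaborative intelligent things" pattern.
    report = drone.survey(field_id)
    if report.ready_for_harvest:
        harvester.dispatch_to(report.field_id)

coordinate(SurveyDrone(), AutonomousHarvester(), "field-42")
```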

Augmented Analytics

Augmented analytics focuses on a specific area of augmented intelligence, using machine learning (ML) to transform how analytics content is developed, consumed and shared. Augmented analytics capabilities will advance rapidly to mainstream adoption, as a key feature of data preparation, data management, modern analytics, business process management, process mining and data science platforms. Automated insights from augmented analytics will also be embedded in enterprise applications — for example, those of the HR, finance, sales, marketing, customer service, procurement and asset management departments — to optimize the decisions and actions of all employees within their context, not just those of analysts and data scientists. Augmented analytics automates the process of data preparation, insight generation and insight visualization, eliminating the need for professional data scientists in many situations.
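As a rough illustration of what automated insight generation can mean in practice, the Python sketch below ranks which columns of a table move most with a chosen outcome, with no analyst in the loop. The dataset and column names are invented for the example.

```python
# Minimal sketch of automated insight generation: surface the numeric
# columns most correlated with a chosen outcome. Data is invented.
import pandas as pd

def top_insights(df: pd.DataFrame, target: str, k: int = 3) -> pd.Series:
    # Correlate every numeric column with the target, rank by strength.
    numeric = df.select_dtypes("number")
    corr = numeric.corr()[target].drop(target)
    return corr.reindex(corr.abs().sort_values(ascending=False).index).head(k)

sales = pd.DataFrame({
    "ad_spend": [10, 20, 30, 40, 50],
    "discount": [5, 3, 4, 2, 1],
    "revenue":  [100, 180, 260, 350, 430],
})
print(top_insights(sales, target="revenue"))
```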


“This will lead to citizen data science, an emerging set of capabilities and practices that enables users whose main job is outside the field of statistics and analytics to extract predictive and prescriptive insights from data,” said Cearley. “Through 2020, the number of citizen data scientists will grow five times faster than the number of expert data scientists. Organizations can use citizen data scientists to fill the data science and machine learning talent gap caused by the shortage and high cost of data scientists.”

AI-Driven Development

The market is rapidly shifting from an approach in which professional data scientists must partner with application developers to create most AI-enhanced solutions to a model in which the professional developer can operate alone using predefined models delivered as a service. This provides the developer with an ecosystem of AI algorithms and models, as well as development tools tailored to integrating AI capabilities and models into a solution. Another level of opportunity for professional application development arises as AI is applied to the development process itself to automate various data science, application development and testing functions. By 2022, at least 40 percent of new application development projects will have AI co-developers on their team.
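The "predefined models delivered as a service" pattern can be pictured as follows: the developer calls a hosted model over HTTP rather than training one. The endpoint URL and response shape in this sketch are hypothetical placeholders, not a real vendor API.

```python
# Sketch of calling a predefined model as a service. The endpoint and
# response shape are hypothetical placeholders, not a real vendor API.
import requests

def classify_sentiment(text: str) -> str:
    resp = requests.post(
        "https://models.example.com/v1/sentiment",  # hypothetical service
        json={"text": text},
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["label"]  # e.g. "positive" / "negative"

# Against a real service this would return a label:
# print(classify_sentiment("The new release is a big improvement."))
```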

“Ultimately, highly advanced AI-powered development environments automating both functional and nonfunctional aspects of applications will give rise to a new age of the ‘citizen application developer’, where nonprofessionals will be able to use AI-driven tools to automatically generate new solutions. Tools that enable non-professionals to generate applications without coding are not new, but we expect that AI-powered systems will drive a new level of flexibility,” said Cearley.


Digital Twins

A digital twin refers to the digital representation of a real-world entity or system. By 2020, Gartner estimates there will be more than 20 billion connected sensors and endpoints, and digital twins will exist for potentially billions of things. Organizations will implement digital twins simply at first. They will evolve them over time, improving their ability to collect and visualize the right data, apply the right analytics and rules, and respond effectively to business objectives.
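In its simplest form, a digital twin is a software object that mirrors a physical asset by ingesting its telemetry and layering rules on the mirrored state. The Python sketch below illustrates the idea; the asset, fields and threshold are invented for the example.

```python
# Minimal digital twin sketch: mirror an asset's state from telemetry
# and layer a rule on top. Asset, fields and threshold are invented.
from dataclasses import dataclass, field

@dataclass
class PumpTwin:
    asset_id: str
    temperature_c: float = 0.0
    history: list = field(default_factory=list)

    def ingest(self, reading: dict) -> None:
        # Update the twin's state from a telemetry message.
        self.temperature_c = reading["temperature_c"]
        self.history.append(reading)

    def needs_attention(self) -> bool:
        # A rule evaluated against the mirrored state.
        return self.temperature_c > 80.0

twin = PumpTwin(asset_id="pump-7")
twin.ingest({"temperature_c": 85.2})
print(twin.asset_id, "alert:", twin.needs_attention())
```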

“One aspect of the digital twin evolution that moves beyond IoT will be enterprises implementing digital twins of their organizations (DTOs). A DTO is a dynamic software model that relies on operational or other data to understand how an organization operationalizes its business model, connects with its current state, deploys resources and responds to changes to deliver expected customer value,” said Cearley. “DTOs help drive efficiencies in business processes, as well as create more flexible, dynamic and responsive processes that can potentially react to changing conditions automatically.”

Empowered Edge

The edge refers to endpoint devices used by people or embedded in the world around us. Edge computing describes a computing topology in which information processing, and content collection and delivery, are placed closer to these endpoints. The topology keeps traffic and processing local, with the goal of reducing both network load and latency.

In the near term, the edge is being driven by IoT and the need to keep processing close to the endpoint rather than on a centralized cloud server. However, rather than creating a new architecture, cloud computing and edge computing will evolve as complementary models, with cloud services managed as a centralized service that executes not only on centralized servers but also on distributed on-premises servers and on the edge devices themselves.
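The pattern is easy to picture in code: process raw readings on the device and ship only a compact summary upstream. The sketch below is illustrative, and the uplink function stands in for any MQTT or HTTP client.

```python
# Edge pattern sketch: reduce a high-frequency sensor stream to a few
# aggregates locally, then send one small message upstream.
from statistics import mean

def summarize_at_edge(readings: list[float]) -> dict:
    # All heavy lifting happens on the device.
    return {"count": len(readings), "mean": mean(readings), "max": max(readings)}

def send_to_cloud(summary: dict) -> None:
    print("uplink:", summary)  # placeholder for an MQTT/HTTP publish

raw = [21.0, 21.2, 24.9, 22.1, 21.8]   # e.g. one minute of samples
send_to_cloud(summarize_at_edge(raw))  # one message instead of many
```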

Over the next five years, specialized AI chips, along with greater processing power, storage and other advanced capabilities, will be added to a wider array of edge devices. The extreme heterogeneity of this embedded IoT world and the long life cycles of assets such as industrial systems will create significant management challenges. Longer term, as 5G matures, the expanding edge computing environment will have more robust communication back to centralized services. 5G provides lower latency, higher bandwidth and, very importantly for the edge, a dramatic increase in the number of nodes (edge endpoints) per square kilometer.

Immersive Experience

Conversational platforms are changing the way in which people interact with the digital world. Virtual reality (VR), augmented reality (AR) and mixed reality (MR) are changing the way in which people perceive the digital world. This combined shift in perception and interaction models leads to the future immersive user experience.

“Over time, we will shift from thinking about individual devices and fragmented user interface (UI) technologies to a multichannel and multimodal experience. The multimodal experience will connect people with the digital world across hundreds of edge devices that surround them, including traditional computing devices, wearables, automobiles, environmental sensors and consumer appliances,” said Cearley. “The multichannel experience will use all human senses as well as advanced computer senses (such as heat, humidity and radar) across these multimodal devices. This multi-experience environment will create an ambient experience in which the spaces that surround us define ‘the computer’ rather than the individual devices. In effect, the environment is the computer.”

Blockchain

Blockchain, a type of distributed ledger, promises to reshape industries by enabling trust, providing transparency and reducing friction across business ecosystems, potentially lowering costs, reducing transaction settlement times and improving cash flow. Today, trust is placed in banks, clearinghouses, governments and many other institutions as central authorities, with the "single version of the truth" maintained securely in their databases. The centralized trust model adds delays and friction costs (commissions, fees and the time value of money) to transactions. Blockchain provides an alternative trust model and removes the need for central authorities to arbitrate transactions.
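The tamper evidence at the heart of the technology comes from hash-linking: each block commits to the previous block's hash, so altering history invalidates every later block. The Python sketch below shows only that linking; it omits consensus, networking and signatures entirely.

```python
# Minimal hash-linked chain: each block stores the previous block's
# hash, so rewriting history breaks every later link. No consensus,
# networking or signatures here.
import hashlib
import json

def block_hash(block: dict) -> str:
    return hashlib.sha256(json.dumps(block, sort_keys=True).encode()).hexdigest()

def add_block(chain: list, data: str) -> None:
    prev = block_hash(chain[-1]) if chain else "0" * 64
    chain.append({"index": len(chain), "data": data, "prev_hash": prev})

chain: list = []
add_block(chain, "Alice pays Bob 5")
add_block(chain, "Bob pays Carol 2")
print(chain[1]["prev_hash"] == block_hash(chain[0]))  # True: linked
```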

“Current blockchain technologies and concepts are immature, poorly understood and unproven in mission-critical, at-scale business operations. This is particularly so with the complex elements that support more sophisticated scenarios,” said Cearley. “Despite the challenges, the significant potential for disruption means CIOs and IT leaders should begin evaluating blockchain, even if they don't aggressively adopt the technologies in the next few years.”

Many blockchain initiatives today do not implement all of the attributes of blockchain — for example, a highly distributed database. These blockchain-inspired solutions are positioned as a means to achieve operational efficiency by automating business processes, or by digitizing records. They have the potential to enhance the sharing of information among known entities, as well as improving opportunities for tracking and tracing physical and digital assets. However, these approaches miss the value of true blockchain disruption and may increase vendor lock-in. Organizations choosing this option should understand the limitations, be prepared to move to complete blockchain solutions over time, and recognize that the same outcomes may be achieved with more efficient and tuned use of existing non-blockchain technologies.


Smart Spaces

A smart space is a physical or digital environment in which humans and technology-enabled systems interact in an increasingly open, connected, coordinated and intelligent ecosystem. Multiple elements — including people, processes, services and things — come together in a smart space to create a more immersive, interactive and automated experience for a target set of people and industry scenarios.

“This trend has been coalescing for some time around elements such as smart cities, digital workplaces, smart homes and connected factories. We believe the market is entering a period of accelerated delivery of robust smart spaces with technology becoming an integral part of our daily lives, whether as employees, customers, consumers, community members or citizens,” said Cearley.

Digital Ethics and Privacy

Digital ethics and privacy is a growing concern for individuals, organizations and governments. People are increasingly concerned about how their personal information is being used by organizations in both the public and private sectors, and the backlash will only increase for organizations that are not proactively addressing these concerns.

“Any discussion on privacy must be grounded in the broader topic of digital ethics and the trust of your customers, constituents and employees. While privacy and security are foundational components in building trust, trust is actually about more than just these components,” said Cearley. “Trust is the acceptance of the truth of a statement without evidence or investigation. Ultimately an organization's position on privacy must be driven by its broader position on ethics and trust. Shifting from privacy to ethics moves the conversation beyond ‘are we compliant' toward ‘are we doing the right thing.'”

Quantum Computing

Quantum computing (QC) is a type of nonclassical computing that operates on the quantum state of subatomic particles (for example, electrons and ions) that represent information as elements denoted as quantum bits (qubits). The parallel execution and exponential scalability of quantum computers mean they excel at problems too complex for a traditional approach, or where a traditional algorithm would take too long to find a solution. Industries such as automotive, financial services, insurance, pharmaceuticals and the military, as well as research organizations, have the most to gain from advances in QC. In the pharmaceutical industry, for example, QC could be used to model molecular interactions at the atomic level to accelerate time to market for new cancer-treating drugs, or to accelerate and more accurately predict the interaction of proteins, leading to new pharmaceutical methodologies.
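The qubit idea can be illustrated in a few lines of simulation: a single qubit's state is a two-component complex vector, and a Hadamard gate puts it into an equal superposition of 0 and 1. This is pure NumPy simulation, not a quantum SDK.

```python
# Toy qubit simulation: a Hadamard gate turns |0> into an equal
# superposition, giving a 50/50 measurement distribution.
import numpy as np

ket0 = np.array([1, 0], dtype=complex)        # |0>
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)  # Hadamard gate

state = H @ ket0              # (|0> + |1>) / sqrt(2)
probs = np.abs(state) ** 2    # measurement probabilities
print(probs)                  # [0.5 0.5]
```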

“CIOs and IT leaders should start planning for QC by increasing their understanding of the technology and how it can apply to real-world business problems. Learn while the technology is still emerging. Identify real-world problems where QC has potential, and consider the possible impact on security,” said Cearley. “But don't believe the hype that it will revolutionize things in the next few years. Most organizations should learn about and monitor QC through 2022, and perhaps exploit it from 2023 or 2025.”
