Top 5 IT trends in 2023 you should be aware of
The world you and I live in today is a world of global instability. Pandemic? War? Inflation? Earthquake? Tsunami? Geopolitical shifts? Nuclear war? Can anyone say with certainty what will happen tomorrow? In such turbulent times, traditional patterns of doing business break down; flexibility, originality, spontaneity, and creativity are required. With that in mind, let's look at the trends taking shape in the IT industry.
1. Pushing the boundaries even further
The trend covers all technologies that simulate reality, from virtual and augmented reality to mixed reality and everything in between. It is a critical technological trend today, as the modern world increasingly demands going beyond familiar frameworks and boundaries. It gained its strongest momentum during the pandemic and shows no signs of stopping.
The metaverse. Many players in the technology market are convinced that it has significant economic prospects. Analysts predict a market worth $800 billion by 2025 and $2.5 trillion by 2030. According to its developers, the metaverse is not just a place to have fun: it can also host work, income, education, telemedicine, online commerce, tourism, and more.
2. Artificial intelligence in humans
This trend involves augmenting human abilities and skills with the help of computer technologies. Some skeptics oppose turning people into cyborgs, while supporters believe such technologies will open up new opportunities for people.
Conventional prostheses have existed for many years, and over time technology has made them more and more comfortable. Neuroprosthetics, however, completely changes the idea of classical prosthetics: a person can now move the fingers of a prosthesis by the power of thought alone.
A vivid example: the startup Neuralink, co-founded by Elon Musk, announced that it expects to implant a neurochip in a human brain this spring, capable of providing a brain-computer interface. The tiny chip, with a miniature battery, connects to specific areas of the cerebral cortex through a web of more than a thousand electrodes a few microns in diameter. A dedicated surgical robot was created for the implantation.
Such neural chips are meant to be integrated into the human brain to counter the effects of certain brain diseases, expand memory, treat paralysis, blindness, depression, and Parkinson's disease, and to control complex systems more effectively. Suggested future possibilities include mentally playing video games, summoning a Tesla, or downloading and replaying memories.
Another example is injectable chips: they operate as self-contained systems with no wires or auxiliary devices, powered and communicated with via ultrasound. Such a chip can serve as a medical device that collects biological data about whichever part of the body it is implanted in. In the future, this could become a breakthrough in wireless miniature medical implants for collecting sensory information.
3. Cyber security
Naturally, cyber security has already become one of the most promising areas in IT, and not only at the private or corporate level but also at the state level, especially after the russian attack on Ukraine, which was preceded by the hacking of infrastructure facilities, government websites, and the resources of large institutions and banks.
The traditional fragmented approach to data security is becoming a thing of the past, giving way to a comprehensive approach: the so-called cybersecurity mesh. As defined by Gartner, a cybersecurity mesh is a distributed architectural approach to scalable, flexible, and robust cyber control. It is a more modular approach in which IT departments create smaller perimeters within the system that protect individual access points, allowing network managers to give users different levels of access to company data and assets. That makes it much harder for hackers to reach the entire system, which matters because 34% of data leaks and hacks originate within the network itself.
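To make the "smaller perimeters" idea concrete, here is a minimal, hypothetical Python sketch: each asset carries its own access policy, so credentials that open one perimeter do not automatically open the rest. All names and policies below are invented for illustration and do not describe any real product.

```python
# Illustrative sketch of the cybersecurity mesh idea: instead of one big
# perimeter, every asset has its own small perimeter (policy), and each
# access request is checked against that asset's policy independently.

from dataclasses import dataclass


@dataclass
class User:
    name: str
    role: str          # e.g. "engineer", "finance", "hr", "admin"
    network: str       # e.g. "office-lan", "vpn", "public"


@dataclass
class AssetPolicy:
    allowed_roles: set[str]
    allowed_networks: set[str]


# Each asset gets its own micro-perimeter; compromising one policy
# does not grant access to the others.
POLICIES = {
    "hr-database":    AssetPolicy({"hr", "admin"}, {"office-lan"}),
    "source-code":    AssetPolicy({"engineer", "admin"}, {"office-lan", "vpn"}),
    "public-website": AssetPolicy({"engineer", "admin", "finance", "hr"},
                                  {"office-lan", "vpn", "public"}),
}


def can_access(user: User, asset: str) -> bool:
    """Check a request against the asset's own perimeter only."""
    policy = POLICIES.get(asset)
    if policy is None:
        return False  # unknown assets are denied by default
    return user.role in policy.allowed_roles and user.network in policy.allowed_networks


if __name__ == "__main__":
    alice = User("alice", role="engineer", network="vpn")
    print(can_access(alice, "source-code"))  # True: inside this asset's perimeter
    print(can_access(alice, "hr-database"))  # False: a breach here stops here
```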
According to Gartner's estimates, companies that implement a cybersecurity mesh architecture by 2024 will reduce the financial impact of security incidents by an average of 90%.
But everything that one person creates, another can break. Therefore, attention is now focused on the capabilities of artificial intelligence, which should create a cyber defense system impenetrable to a human hacker.
4. Quantum computing is for everyone
It's no secret that Microsoft, Intel, IBM, AWS, and Google have their own quantum computers. These machines are huge and complex in design and configuration, but, according to scientists, they are capable of surpassing any supercomputer. The Chinese company SpinQ Technology went further, offering anyone willing a quantum PC for a comparatively modest $8,900. With the Gemini Mini, one can get a hands-on feel for the principles of quantum computing.
The appearance of this device illustrates a significant technological trend: quantum computers, phones, internet, and communications are just around the corner.
Why do we need quantum computing? It comes down to analyzing a huge number of options to find the best possible solution: which securities are more profitable to invest in, which combination of drugs will work best, which partner is the best fit for a business deal, and so on.
Let's say you want to seat ten people at a dinner table so that the evening goes as well as possible for everyone. There are more than three million possible arrangements. A classic computer can also find the optimal solution, but the combinatorial search could take a week or a month, because it goes through the possible combinations one by one. A quantum computer, in effect, analyzes the combinations simultaneously, so the result arrives much faster, say, within a minute. Feel the difference?
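To get a feel for why classical brute force struggles here, the small Python sketch below counts the seating arrangements and then scores them one by one. The "compatibility" score is purely made up for the example; only the combinatorial explosion is the point.

```python
# Brute-force seating search: a classical computer must score the
# arrangements one at a time, which is exactly the bottleneck described above.

import math
from itertools import permutations

guests = ["Ann", "Bob", "Cam", "Dee", "Eli", "Fay", "Gus", "Hal", "Ivy", "Jo"]

# 10! = 3,628,800 possible orders around the table ("more than three million")
print(math.factorial(len(guests)))


def score(arrangement):
    # Toy compatibility score for neighbouring guests around a circular table.
    return sum(abs(ord(a[0]) - ord(b[0]))
               for a, b in zip(arrangement, arrangement[1:] + arrangement[:1]))


# Checks all 3.6 million orders sequentially; even this tiny toy problem
# takes a noticeable while in pure Python.
best = max(permutations(guests), key=score)
print(best)
```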
5. Digital Twin
A digital twin is a complex computer model that mirrors a physical object. It allows one to make strategic decisions about a company's operations or production based on computer simulations, without the need for real-world tests. According to logistics giant DHL, the market for digital twins will grow at an explosive rate of almost 40% annually in the coming years, reaching $26 billion by 2025.
A digital twin of a factory, for example, makes it possible to build different optimization models of the enterprise, from which the owners can then choose the best one to increase profits. It also helps predict possible failures, reduce costs, and improve quality control and preventive maintenance.
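As a toy illustration of deciding from simulations rather than real tests, the hypothetical sketch below compares a few factory configurations on a simulated production line. The line model, machine counts, and failure rates are invented for the example, not taken from any real digital twin product.

```python
# Toy "digital twin" of a production line: compare configurations in software
# before committing to changes on the real factory floor.

import random


def simulate_day(machines: int, failure_rate: float, units_per_machine: int = 120) -> int:
    """Return the number of units produced in one simulated day."""
    produced = 0
    for _ in range(machines):
        if random.random() > failure_rate:  # machine ran without breaking down
            produced += units_per_machine
    return produced


def evaluate(config: dict, days: int = 1000) -> float:
    """Average daily output of a configuration over many simulated days."""
    return sum(simulate_day(config["machines"], config["failure_rate"])
               for _ in range(days)) / days


configs = {
    "current line":       {"machines": 8,  "failure_rate": 0.15},
    "add 2 machines":     {"machines": 10, "failure_rate": 0.15},
    "better maintenance": {"machines": 8,  "failure_rate": 0.05},
}

for name, cfg in configs.items():
    print(f"{name}: ~{evaluate(cfg):.0f} units/day")
```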
This technology also allows work processes to be simulated, that is, to create simulators of rescue missions, terrorist threats, and air, ground, and sea operations, where law enforcement and military personnel can hone their skills without wearing out equipment whose operation and maintenance cost a lot of money. Digital twins are in demand in cyber security as well: a twin can simulate a hacker attack on a company or institution, revealing security gaps and other weak points.
The technology of digital twins has not yet reached the mass market. But there are more accessible tools for business: they are just as visual, yet without the complex functionality and mathematical models.
For example, one such simplified model is called a "digital shadow": it reflects only part of the data of a physical object (weight, geolocation, availability in the warehouse). Such tools are relatively simple and provide workflow visualization, convenient monitoring, and data analytics.
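A digital shadow in this sense can be as simple as a record that mirrors a few fields of a physical object. The sketch below uses invented field names purely for illustration.

```python
# A "digital shadow": a partial mirror of a physical object, updated from
# sensor or warehouse data, but with no behavioural model of its own.

from dataclasses import dataclass
from datetime import datetime


@dataclass
class PalletShadow:
    pallet_id: str
    weight_kg: float
    geolocation: tuple[float, float]  # (latitude, longitude)
    in_warehouse: bool
    last_updated: datetime


shadow = PalletShadow(
    pallet_id="PLT-0042",
    weight_kg=412.5,
    geolocation=(50.4501, 30.5234),
    in_warehouse=True,
    last_updated=datetime.now(),
)

# Enough for monitoring and analytics dashboards; unlike a full digital twin,
# it cannot simulate how the object would behave under new conditions.
print(shadow)
```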
By the way, there are predictions that within the next ten years the first thinking digital twins of people will appear: exact copies of a person in the physical world whose purpose is to help the "real version" of themselves and to give it feedback. Initially, such twins were no more than 3D computer models, but artificial intelligence combined with the Internet of Things (the concept of a network of data exchange between physical objects) means that users can now digitally create "something" that constantly learns and helps improve the "original".
Rob Enderle, president and chief analyst of the Enderle Group, said in an interview with the BBC that "digital people" will be one of the defining features of the metaverse era, which is already quite close. In his opinion, the creation of digital human twins will lead companies to start hiring them instead of paying salaries to people, because a thinking copy of ourselves can be incredibly useful and profitable for employers. Their very emergence therefore raises a host of questions and ethical considerations.