The Ups and Downs of Data Latency
What is latency?
Usually when we hear the word "latency" it refers
to streaming video, downloading music, or connecting to a mobile phone. While
latency issues in these cases are merely frustrating or inconvenient, in HPC
and data communications latency can make or break your business. Latency is
defined as the time it takes for an end user to retrieve data from a source.
Note that latency should not be confused with throughput: latency is about the time it takes for data to reach the end user, not about the amount of data that can be transferred over the connection. Data timeliness comes in several forms, and not every form is right for every business.
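To make the distinction concrete, here is a minimal Python sketch (the URL is a hypothetical placeholder) that estimates both quantities on one connection: the time to first byte approximates latency, while bytes received per second approximates throughput.

```python
import time
import urllib.request

URL = "https://example.com/large-file"  # hypothetical endpoint

start = time.perf_counter()
with urllib.request.urlopen(URL) as response:
    first_chunk = response.read(1)       # wait for the first byte
    ttfb = time.perf_counter() - start   # ~latency: time to first byte

    body = first_chunk + response.read() # pull the rest of the payload
    total = time.perf_counter() - start

throughput = len(body) / total           # bytes per second
print(f"Latency (TTFB): {ttfb * 1000:.1f} ms")
print(f"Throughput:     {throughput / 1e6:.2f} MB/s")
```

A fast connection can have high throughput and still suffer high latency, which is why the two numbers are reported separately above.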
Static data. Some data is not updated regularly. As a rule, it is entered into the database once and rarely changes afterward.
Example: contact information for a supplier or a customer.
This type of data is usually saved only once, and business success does not
depend on how quickly it is refreshed.
Near-term data is information that is updated at regular
intervals. Unlike real-time data, near-term data is recorded "as timely as
needed" rather than continuously. Near-term data is more cost-effective and
easier to manage than real-time data.
Example: a monthly sales report or a daily cash register reading. This
information is recorded and sent at regular intervals, and it does not need to
be delivered in real time.
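As a rough sketch of this "as timely as needed" pattern (the interval and the upload function are illustrative assumptions, not details from the article), records can be buffered locally and flushed on a fixed schedule rather than on every event:

```python
import time

BATCH_INTERVAL_S = 60          # illustrative: flush once a minute
buffer = []

def record_sale(sale):
    """Buffer events locally; nothing is sent immediately."""
    buffer.append(sale)

def flush_to_store(batch):
    """Stand-in for a database write or report upload."""
    print(f"Uploading {len(batch)} records at {time.strftime('%H:%M:%S')}")

def run_batch_loop():
    """Ship whatever has accumulated, once per interval."""
    while True:
        time.sleep(BATCH_INTERVAL_S)
        if buffer:
            flush_to_store(buffer[:])
            buffer.clear()
```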
Real-time data is what we most often associate with advanced computing.
This is data that is available in the database the moment business activity occurs, with zero or very little latency. Real-time data is the most
expensive and the most difficult to obtain. However, it provides an immediate
return on investment when you have the right devices and processes in place.
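By contrast with the batch sketch above, a real-time pipeline hands each event off the moment it occurs. In this minimal sketch the in-process queue stands in for a message bus (the article names no specific technology):

```python
import queue
import threading
import time

events = queue.Queue()  # stand-in for a message bus such as Kafka or MQTT

def on_business_event(event):
    """Called the moment activity occurs; no batching, no delay."""
    event["ts"] = time.time()
    events.put(event)  # hand off immediately

def consumer():
    """Processes each event as soon as it arrives."""
    while True:
        event = events.get()
        lag = time.time() - event["ts"]
        print(f"Processed {event['type']} with {lag * 1000:.2f} ms in-process latency")

threading.Thread(target=consumer, daemon=True).start()
on_business_event({"type": "sale", "amount": 42})
time.sleep(0.1)  # let the consumer run before the script exits
```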
Layers affecting latency
Successful latency management depends on a robust
infrastructure made up of three layers:
Edge - the source where data is collected and where intelligence and/or
compute power resides.
Gateway - where data is moved and staged until it is
centralized in the cloud or on a high-performance platform.
Data center - the physical structure or facility where
cloud and edge computing platforms are housed.
The functionality of these three layers is critical to application
performance and end-user experience.
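As a rough illustration (the per-layer delays below are made-up numbers, not measurements), end-to-end latency can be modeled as the sum of each layer's contribution, which is why a weakness in any one layer degrades the whole experience:

```python
# Illustrative per-stage delays (assumed values, not benchmarks)
STAGES = [
    ("edge: sensor read",         0.002),
    ("gateway: buffer + forward", 0.015),
    ("data center: ingest",       0.040),
]

def end_to_end_latency(stages):
    """Total latency is the sum of each layer's contribution."""
    return sum(delay for _, delay in stages)

for name, delay in STAGES:
    print(f"{name:28s} {delay * 1000:6.1f} ms")
print(f"{'end-to-end':28s} {end_to_end_latency(STAGES) * 1000:6.1f} ms")
```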
Data latency, cloud and edge computing
In a typical cloud environment, data processing takes place
in a centralized data warehouse. As a result, latency in the cloud is less
predictable and more difficult to measure. Your services are more prone to
latency issues because moving applications to the cloud does not address the
underlying problem of distance between cloud services and users. Factors
affecting latency include the number of hand-offs between individual satellites
or the number of router hops between source and destination. In addition,
virtual machines (VMs) that sit on different networks can also introduce delays
in service delivery.
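One way to see this unpredictability for yourself is to sample connection latency repeatedly and look at the spread. The sketch below (the endpoint is a hypothetical placeholder) times only the TCP handshake:

```python
import socket
import statistics
import time

HOST, PORT = "example.com", 443  # hypothetical cloud endpoint
samples = []

for _ in range(10):
    start = time.perf_counter()
    with socket.create_connection((HOST, PORT), timeout=5):
        pass                     # time only the TCP handshake
    samples.append(time.perf_counter() - start)
    time.sleep(0.5)

print(f"min    {min(samples) * 1000:.1f} ms")
print(f"median {statistics.median(samples) * 1000:.1f} ms")
print(f"max    {max(samples) * 1000:.1f} ms")  # a wide spread means unpredictable latency
```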
Enter edge computing
Edge computing can reduce latency issues in the cloud
because low data latency is the foundation of edge computing. Edge computing
happens close to the physical place where data is generated, and it uses
Industrial Internet of Things (IIoT) devices such as smart sensors to collect
and analyze data. These devices can then make decisions in real time. Real-time
edge analytics can help organizations find correlations, hidden patterns, and
other valuable insights. Because the data is available as soon as business
activity occurs, it is extremely useful for mission-critical processes.
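A minimal sketch of such real-time edge analytics (the window size, threshold, and simulated sensor stream are all illustrative assumptions): each reading is checked against a rolling average the moment it arrives, so the device itself can flag an anomaly without a round trip to the cloud.

```python
from collections import deque
import random

WINDOW = 20        # illustrative rolling-window size
THRESHOLD = 3.0    # flag readings 3x the recent average

readings = deque(maxlen=WINDOW)

def on_sensor_reading(value):
    """Analyze each reading at the edge, the moment it arrives."""
    if len(readings) == WINDOW:
        avg = sum(readings) / WINDOW
        if avg > 0 and value > THRESHOLD * avg:
            print(f"Anomaly: {value:.1f} vs rolling avg {avg:.1f}")
    readings.append(value)

# Simulated IIoT sensor stream with one spike injected at step 150
for i in range(200):
    on_sensor_reading(random.gauss(10, 1) if i != 150 else 80.0)
```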