Data Analytics
Example:
A factory that uses machine health monitoring on its equipment must be notified of a possible
malfunction early enough for an operator to intervene. It is therefore critical for system
architects, data engineers, and security architects to avoid these potential stumbling blocks and
deploy real-time streaming platforms effectively. Once the issues above are addressed, IT teams
can use the framework and begin extracting information without delay.
Task 2
In-memory processing
In-memory processing is the practice of operating on data that is held entirely in a computer's
main memory (RAM). Other data processing methods, by contrast, rely on reading and writing
data to and from slower media such as disk drives. In-memory processing is most often
associated with large-scale environments in which many computers are pooled so that their
combined RAM can serve as one large, fast storage medium. Because the storage behaves as a
single large allocation of RAM, large data sets can be processed all at once instead of being
broken into pieces that fit into the RAM of a single machine.
In-memory processing relies solely on data held in RAM and so eliminates slow data accesses
entirely. The latency that is normal when reading hard disk drives or SSDs has no effect on
overall processing performance. Code running on one or more computers handles both the
processing work and the data in memory; with several computers, the software divides the
processing work into smaller tasks that are assigned to each machine in parallel. The technology
known as an in-memory data grid (IMDG) is often used for in-memory processing. Hazelcast
IMDG is one such example: it lets users run complex data processing jobs across a cluster of
hardware servers while maintaining extreme speed.
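The idea of dividing work across pooled memory can be sketched on a single machine with Python's standard multiprocessing module; this is an illustrative analogue, not how Hazelcast IMDG itself is programmed, and the function names here are made up for the example. A real IMDG would distribute the chunks across the RAM of many servers rather than across local worker processes.

```python
from multiprocessing import Pool

def process_chunk(chunk):
    # Each worker operates on its own slice of the RAM-resident data,
    # mirroring how an IMDG assigns partitions of a job to cluster members.
    return sum(x * x for x in chunk)

def parallel_sum_of_squares(data, workers=4):
    # Split the in-memory data set into roughly equal chunks, one per task.
    size = max(1, len(data) // workers)
    chunks = [data[i:i + size] for i in range(0, len(data), size)]
    with Pool(workers) as pool:
        # Workers process their chunks in parallel; partial results
        # are then combined, as in a scatter/gather job on a data grid.
        partials = pool.map(process_chunk, chunks)
    return sum(partials)

if __name__ == "__main__":
    data = list(range(1_000_000))          # data lives entirely in RAM
    print(parallel_sum_of_squares(data))
```

Because no chunk is ever written to disk, the only costs are computation and inter-process communication, which is the performance argument for in-memory processing.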
Data Stream Processing
Streaming is still a relatively new idea. Stream computing begins with the data sources
themselves. A data stream is a continuous flow of data that changes frequently and loses validity
after a short period of time. This could include transactional data, data from IoT devices,
hardware sensors, and so on.
Because these data sources have no beginning or end, they cannot be divided into batches, and
there is no opportunity to store the data before processing it. Instead, the data is processed in
real time as it arrives.
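As a minimal sketch of this "process as it arrives" model (the sensor values and window size below are invented for illustration), a Python generator can consume an unbounded feed while keeping only a small rolling window in memory, so the stream itself is never stored:

```python
from collections import deque

def rolling_average(stream, window=3):
    # Handle each reading the moment it arrives; only the last `window`
    # values are retained, so memory use is constant even for an
    # endless stream.
    recent = deque(maxlen=window)
    for value in stream:
        recent.append(value)
        yield sum(recent) / len(recent)

readings = iter([10, 12, 11, 30, 29])   # stand-in for a live sensor feed
averages = list(rolling_average(readings))
```

Here the list stands in for a feed that would normally never end; each average is emitted immediately, before later readings even exist, which is the defining property of stream processing.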
Complex event processing (CEP), also known as event, stream, or event stream processing, is the
use of technology to query data before storing it in a database or, in some cases, without ever
storing it. CEP is an operational method for aggregating a large amount of data and identifying
and analyzing cause-and-effect relationships between events in real time. It continuously
compares incoming events against a pattern, giving you insight into what is going on and
allowing you to take constructive action.
Complex events are often linked to critical business events (such as opportunities or threats),
implying that they must be responded to in real time, or at least as close to real time as possible.
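A toy version of CEP-style pattern matching (the pattern, threshold, and function name are assumptions for the example, not part of any particular CEP product) might flag the machine-health scenario from earlier: several consecutive over-limit readings form a complex event that warrants an alert.

```python
def detect_overheat(events, threshold=90, run_length=3):
    # Compare the incoming event stream to a simple pattern:
    # `run_length` consecutive readings above `threshold` completes
    # a complex event and raises an alert.
    streak = 0
    alerts = []
    for i, reading in enumerate(events):
        streak = streak + 1 if reading > threshold else 0
        if streak >= run_length:
            alerts.append(i)  # index of the event that completed the pattern
    return alerts
```

No single reading is meaningful on its own; it is the cause-and-effect relationship across events, detected as they stream in, that triggers the real-time response.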
Task 3
Example 1
Dashboards of this kind may include a variety of features, including a customizable interface,
some degree of interactivity, and the ability to pull data in real time from multiple sources. They
simplify data analysis by letting users visualize otherwise complicated, heavy raw data, and they
give readers an at-a-glance rundown of the past, the present, and possible future scenarios.
Example 2:
The style is consistent, with many shades of blue that don't overlap, and the graphs and figures
aren't cluttered and follow a straightforward visual order, making every marketing KPI
understandable. The dashboard's message is clear at a glance, and the alignment and white space
give the eyes a break from the struggle to get to the point.
Example 3:
Our sales dashboard, one of the best business intelligence dashboards for profitability, is focused
on helping you consistently reach sales goals and promote growth, improving your KPI
monitoring processes and, ultimately, your bottom line. This dashboard focuses on high-level
metrics and will help you create a data-driven atmosphere, which is critical for sales and,
ultimately, profitability.
Example 4:
This financial dashboard example divides a variety of financial processes into manageable
chunks. You'll be able to see your working capital, cash conversion time, and vendor payment
error rate in detail. This set of priceless insights contains all of the required ingredients for
creating streamlined processes that reduce unnecessary costs while dramatically increasing
productivity.
Example 5:
Corporate executives, general managers, and sales and marketing departments all use these
dashboards. A sales dashboard displays information on product or retail sales, sales costs, and
other KPIs, allowing users to monitor progress against sales targets and spot potential problems.
A marketing dashboard, on the other hand, provides information on prices, response rates, lead
generation, and other metrics.