
Data Analytics


Task 1:

 Data Analytics Adoption Planning


Analytics adoption is a continuous process. Drive adoption by delivering solutions
and improvements that meet stakeholders' needs. The prospect management report's aim is
to help growth officers prioritize which prospects to contact first.
Example:
Gartner polled 430 businesses to find out how far along their Big Data, BI, Benchmarking, and
Analytics initiatives were. Their findings indicate that not only is adoption high, but so is
interest.

 Market dynamics and business motivation


Market dynamics are forces that influence prices and producer and consumer behavior. These
forces produce price signals in a market, which are the outcome of supply and demand
fluctuations for a given product or service. Any business or government policy may be
influenced by market dynamics. Other than price, demand, and supply, there are competitive
market forces at work.
The primary economic policy goals are to maximize productivity, sustain full employment,
reduce inflation, and improve the living standards of the population. A well-functioning market
economy is complex, with dynamic interactions between companies and households.
Understanding the sources of output and employment growth requires the ability to quantify
these dynamics and, through demand and supply factors, price inflation.
Example:
Consumer demand can be a dominant market dynamic at times. According to a study by The NPD
Group, consumer spending is on the rise, especially for luxury fashion products such as clothing
and accessories.
Sales of luxury fashion products have increased, according to the January 2019 NPD survey, as
new brands have arisen and online shopping outlets have created a more competitive
environment while gaining market share thanks to customer demographics and preferences.
As demand for luxury clothing rises, manufacturers and brands will be able to raise prices,
stimulating the market and boosting the overall economy.
“If we pay attention to what buyers are saying, these emerging business trends spell a great deal
of potential across the entire luxury fashion market,” says Marshal Cohen, chief industry advisor
at The NPD Group.
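The supply-and-demand mechanism described above can be sketched numerically. In the minimal sketch below, the linear demand and supply curves and all coefficients are hypothetical, chosen only to illustrate how a price signal emerges where the two forces meet:

```python
# Linear demand: Qd = a - b*P; linear supply: Qs = c + d*P.
# At equilibrium Qd = Qs, so the market-clearing price is P* = (a - c) / (b + d).

def equilibrium_price(a: float, b: float, c: float, d: float) -> float:
    """Price at which quantity demanded equals quantity supplied."""
    return (a - c) / (b + d)

def quantity_demanded(a: float, b: float, price: float) -> float:
    return a - b * price

# Hypothetical market: demand Qd = 100 - 2P, supply Qs = 10 + 4P.
p_star = equilibrium_price(100, 2, 10, 4)   # (100 - 10) / (2 + 4) = 15.0
q_star = quantity_demanded(100, 2, p_star)  # 100 - 2*15 = 70.0
print(p_star, q_star)
```

Shifting the demand intercept `a` upward (rising consumer demand, as in the NPD example) raises the equilibrium price, which is exactly the price signal the text describes.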

 Automated data acquisition and processing


During data entry, the expense report management system directly involves both
the employee and administrative personnel, who must manually review receipts and verify the
accuracy of the data entered by employees. This method typically suffers from several
main issues: it takes time, it is prone to errors, and it is inefficient. It's tedious because employees
and back-office workers must repeat the same procedures for each cost object.
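The manual review step described above can, in principle, be automated. A minimal sketch, assuming each expense entry arrives as a dictionary; the policy categories and caps shown are hypothetical:

```python
# Validate employee expense entries against simple policy rules,
# replacing the repetitive manual review described above.

POLICY_LIMITS = {"meals": 50.0, "travel": 500.0, "lodging": 200.0}  # hypothetical caps

def validate_expense(entry: dict) -> list:
    """Return a list of problems found; an empty list means the entry passes."""
    problems = []
    category = entry.get("category")
    amount = entry.get("amount")
    if category not in POLICY_LIMITS:
        problems.append(f"unknown category: {category!r}")
    elif amount is None or amount <= 0:
        problems.append("amount must be a positive number")
    elif amount > POLICY_LIMITS[category]:
        problems.append(f"{category} amount {amount} exceeds cap {POLICY_LIMITS[category]}")
    if not entry.get("receipt_attached"):
        problems.append("receipt missing")
    return problems

print(validate_expense({"category": "meals", "amount": 72.0, "receipt_attached": True}))
```

Entries that pass every rule can flow straight through, so humans only look at the exceptions.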

 Challenges of real-time data analysis


1. In our data consulting practice, clients attach various meanings to the term "real time."
Some claim that real-time analytics means having immediate insights, while
others are fine with waiting several minutes between data collection and the analytics
system's response.
2. Once you have defined what real time means and clearly stated the criteria for
real-time analytics, it's time to start designing the architecture. The architecture must be
capable of processing data at a high rate. Depending on the data source and application,
processing speed requirements can range from milliseconds to minutes. Companies that
aim to implement real-time analytics often overlook the importance of offline analytics,
but you need both to gain insights.
3. Gathering specifications, designing the solution's architecture, selecting the best
technology stack, and resolving hardware and software issues are all essential aspects of
implementing real-time analytics. Absorbed by these technological tasks, however,
businesses often neglect the question of what to do with their internal processes.
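Point 1 above, pinning down what "real time" means, can be made concrete by stating the latency requirement explicitly and checking observed events against it. A minimal sketch; the 0.5-second threshold is a hypothetical agreed requirement, not a standard:

```python
# Classify observed end-to-end latencies against an agreed real-time requirement.

REAL_TIME_SLA_SECONDS = 0.5  # hypothetical: "real time" = insight within half a second

def meets_sla(latency_seconds: float) -> bool:
    """True if the event was processed within the agreed real-time window."""
    return latency_seconds <= REAL_TIME_SLA_SECONDS

def sla_report(latencies: list) -> dict:
    within = sum(1 for l in latencies if meets_sla(l))
    return {"events": len(latencies), "within_sla": within,
            "within_sla_pct": 100.0 * within / len(latencies)}

# Latencies (seconds) between data collection and the analytics system's response.
print(sla_report([0.12, 0.47, 0.90, 0.05, 2.30]))
```

Making the requirement explicit like this turns "real time" from an opinion into a testable criterion that the architecture can be measured against.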

Example:

A factory that uses machine health monitoring on its equipment must be notified of a possible
equipment malfunction in order for the operator to intervene in time. As a result, it's critical for
system architects, data engineers, and security architects to avoid potential stumbling blocks and
effectively deploy real-time streaming platforms. IT teams will be able to use the framework and
begin extracting information without delay if the above issues are addressed.
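The factory example above can be sketched as a simple streaming check: each sensor reading is compared against a health threshold as it arrives, so the operator is alerted without waiting for a batch job. The vibration limit and machine names below are hypothetical:

```python
# Flag equipment readings that exceed a health threshold as they stream in.

VIBRATION_LIMIT_MM_S = 8.0  # hypothetical vibration limit (mm/s)

def monitor(readings):
    """Yield an alert for every reading above the limit, in arrival order."""
    for machine_id, vibration in readings:
        if vibration > VIBRATION_LIMIT_MM_S:
            yield f"ALERT: {machine_id} vibration {vibration} mm/s exceeds {VIBRATION_LIMIT_MM_S}"

stream = [("press-1", 3.2), ("press-2", 9.1), ("press-1", 8.5)]
for alert in monitor(stream):
    print(alert)
```

Because `monitor` is a generator, alerts are emitted as each reading arrives rather than after the whole stream is collected, which is the property the real-time requirement demands.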

Task 2

 In-memory processing

In-memory processing is the act of acting on data that is completely stored in a computer's
memory (e.g., in RAM). Other data processing methods, on the other hand, rely on reading and
writing data to and from slower media such as disc drives. In-memory processing is most often
associated with large-scale environments in which many computers are pooled together so that
their combined RAM can be used as a large and fast storage medium. Large data sets can be
processed all at once because the storage exists as a single large allocation of RAM, rather than
processing data sets that only fit into the RAM of a single device.

In-memory processing relies solely on data stored in RAM and eliminates all sluggish data
accesses. Latency, which is normal when accessing hard disc drives or SSDs, has no effect on
overall processing performance. Code running on one or more computers handles both the
processing work and the data in memory, and in the case of several computers, the software
divides the processing work into smaller tasks that are assigned to each computer in parallel. A
technology known as in-memory data grids (IMDG) is often used for in-memory processing.
Hazelcast IMDG is one such example; it allows users to run complex data processing jobs
across a cluster of hardware servers while maintaining extreme speed.
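The divide-and-combine pattern described above can be sketched within a single machine: a data set held entirely in RAM is split into partitions, each partition is handed to a worker in parallel, and the partial results are merged. This is only an illustration of the idea, not how Hazelcast IMDG itself is implemented:

```python
from multiprocessing import Pool

def partial_sum(partition):
    """Work assigned to one worker: process its in-memory partition."""
    return sum(x * x for x in partition)

def in_memory_sum_of_squares(data, workers=4):
    # Split the RAM-resident data set into one chunk per worker.
    chunks = [data[i::workers] for i in range(workers)]
    with Pool(workers) as pool:
        partials = pool.map(partial_sum, chunks)  # chunks processed in parallel
    return sum(partials)  # combine the partial results

if __name__ == "__main__":
    data = list(range(1000))  # the whole data set lives in memory
    print(in_memory_sum_of_squares(data))
```

In a real IMDG the chunks live in the RAM of different cluster nodes rather than one process's address space, but the split/process/combine flow is the same.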
 Data Stream Processing
Stream processing is a relatively new idea. Streaming computation starts from the data sources.
A data stream is a continuous flow of data that changes frequently and loses validity after a
short period of time. This could include transactional data, data from IoT devices, hardware
sensors, and so on.

Data streams cannot be divided into batches because they have no beginning or end. As a result,
there is no opportunity to store the data first and process it later; instead, the data is processed in
real time.
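Because a stream has no end, it must be processed incrementally rather than loaded as a batch. A minimal sketch, assuming each event arrives as a hypothetical (sensor_id, value) pair from a generator standing in for a continuous source:

```python
# Process a potentially unbounded stream one event at a time,
# keeping only a running aggregate rather than storing the data.

def running_average(events):
    """Yield the average of all values seen so far after each incoming event."""
    total, count = 0.0, 0
    for _, value in events:
        total += value
        count += 1
        yield total / count

def sensor_stream():
    # Stand-in for a continuous source (IoT device, transaction feed, ...).
    for value in [10.0, 20.0, 30.0]:
        yield ("sensor-1", value)

print(list(running_average(sensor_stream())))
```

The consumer holds only a constant amount of state (`total` and `count`), so it works the same whether the stream delivers three events or three billion.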

 Complex event processing

Complex event processing, also known as event, stream, or event stream processing, is the use of
technology to query data prior to storing it in a database or, in some cases, without ever storing
it. Complex event processing is an operational method for aggregating a large amount of data
and identifying and analyzing cause-and-effect relationships between events in real time. CEP
compares constantly incoming events to a pattern, giving you insight into what's going on and
allowing you to take constructive action.
Complex events are often linked to critical business situations (such as opportunities or threats),
implying that they must be responded to in real time, or at least as close to real time as possible.
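The pattern-matching idea behind CEP can be sketched as follows: incoming events are compared against a rule and a complex event is emitted when the rule matches, without the stream ever being stored. The "login failure followed by a large withdrawal within 60 seconds" pattern and all figures below are hypothetical:

```python
# Minimal complex event processing: detect a cause-and-effect pattern
# across an incoming event stream without storing it in a database.

WINDOW_SECONDS = 60  # hypothetical correlation window

def detect(events):
    """events: (timestamp, kind, amount) tuples. Yield an alert whenever a
    'login_failed' is followed by a withdrawal over 1000 within the window."""
    last_failure = None
    for ts, kind, amount in events:
        if kind == "login_failed":
            last_failure = ts
        elif (kind == "withdrawal" and amount > 1000
              and last_failure is not None
              and ts - last_failure <= WINDOW_SECONDS):
            yield (ts, "suspicious withdrawal after failed login")

stream = [(0, "login_failed", 0), (30, "withdrawal", 5000), (200, "withdrawal", 2000)]
print(list(detect(stream)))
```

The second withdrawal produces no alert because it falls outside the window; only the correlated pair becomes a complex event, which is the cause-and-effect analysis the text describes.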
Task 3

 Example 1

Dashboards of this kind may include a variety of features, including a customizable interface,
some degree of interactivity, and the ability to pull data in real-time from multiple sources. They
simplify data analysis by allowing users to visualize otherwise complicated and
heavy raw data, and they provide readers with an at-a-glance rundown of the past, the present, and
possible futures.
 Example 2:

The style is consistent, with many shades of blue that don't overlap, and the graphs and figures
aren't cluttered and follow a straightforward visual order, making every marketing KPI
understandable. The dashboard's message is clear at a glance, and the alignment and white space
give the eyes a break from the struggle to get to the point.
 Example 3:

Our sales dashboard, one of the best business intelligence dashboards for profitability, is focused
on helping you consistently reach sales goals and promote growth, improving your KPI
monitoring processes and, ultimately, your bottom line. This dashboard focuses on high-level
metrics and will help you create a data-driven atmosphere, which is critical for sales and,
ultimately, profitability.
 Example 4:

This financial dashboard example divides a variety of financial processes into manageable
chunks. You'll be able to see your working capital, cash conversion time, and vendor payment
error rate in detail. This set of priceless insights contains all of the required ingredients for
creating streamlined processes that reduce unnecessary costs while dramatically increasing
productivity.
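One of the metrics mentioned above, cash conversion time (the cash conversion cycle), is conventionally computed as days inventory outstanding plus days sales outstanding minus days payables outstanding. A minimal sketch with hypothetical figures:

```python
def cash_conversion_cycle(dio: float, dso: float, dpo: float) -> float:
    """CCC = days inventory outstanding + days sales outstanding
             - days payables outstanding."""
    return dio + dso - dpo

# Hypothetical figures: inventory held 45 days, receivables collected in
# 30 days, suppliers paid after 25 days.
print(cash_conversion_cycle(45, 30, 25))  # days cash is tied up in operations
```

A financial dashboard that tracks this number over time shows at a glance whether working capital is being freed up or increasingly locked in the operating cycle.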
 Example 5:

Corporate executives, general managers, and sales and marketing departments all use these
dashboards. A sales dashboard displays information on product or retail sales, sales costs, and
other KPIs, allowing users to monitor progress against sales targets and spot potential problems.
A marketing dashboard, on the other hand, provides information on prices, response rates, lead
generation, and other metrics.

[1] 10 Ways to Drive Data Analytics Adoption. (n.d.). Ciklum. https://www.ciklum.com/blog/10-ways-to-drive-data-analytics-adoption/

[2] 10 Business Intelligence Dashboard Best Practices & Examples. (n.d.). Datapine. https://www.datapine.com/articles/bi-dashboard-best-practices

[3] Sutner, S. (2020, August 19). Business intelligence dashboard. SearchBusinessAnalytics. https://searchbusinessanalytics.techtarget.com/definition/business-intelligence-dashboard

[4] Market Dynamics. (n.d.). Investopedia. https://www.investopedia.com/terms/m/market-dynamics.asp

[5] Hazelcast. (2019, November 9). What is In-Memory Processing? An Overview with Use Cases. https://hazelcast.com/glossary/in-memory-processing/

[6] Transforming Data With Intelligence. (2018, June 15). https://tdwi.org/articles/2018/06/15/adv-all-real-time-analytics-challenges-and-solutions.aspx
