Class 8

Analysis in experimental studies

 In computing fields, in addition to the observation of subjects
before and after treatment, experimental studies also refer
specifically to research conducted in a laboratory using tools
and materials such as algorithms
 Laboratory Experiment
 Assumption
 You have selected (or developed) tools and
techniques/algorithms based on the literature
 Inputs
 The problem,
 Data,
 algorithms/techniques,
 various setting options
Cont…
 Procedure
 Plan a number of experiments by specifying the data input,
the algorithms to be used, and experimental settings such as
test options
 Write/report the results of each experiment
 Set criteria and compare the experimental results
 Output
 Accuracy/performance, etc., of each experiment
 Lessons learned, new insights, new knowledge
 Interpretation and description of how the results are going to be
used

Cont…

Experiments   Algorithm to be used   Test option          Attributes   Instances
Exp-One       RandomForest           10-fold validation   17           10000
Exp-Two       J48                    10-fold validation   17           10000
Exp-Three     PART                   10-fold validation   17           10000
Exp-Four      RandomForest           80/20
Exp-Five      J48                    80/20
Exp-Six       PART                   80/20

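An experimental plan like the one above can be scripted so that each run is repeatable and comparable. The following minimal Python sketch uses hand-rolled 10-fold splitting and a toy majority-vote classifier as stand-ins for the Weka algorithms in the table; the dataset, labels, and classifier are hypothetical, for illustration only:

```python
import random
from statistics import mean

def kfold_indices(n, k):
    """Split indices 0..n-1 into k shuffled folds for cross-validation."""
    idx = list(range(n))
    random.Random(42).shuffle(idx)  # fixed seed so the experiment is repeatable
    return [idx[i::k] for i in range(k)]

def majority_classifier(train_labels):
    """Toy stand-in 'algorithm': always predicts the most frequent label."""
    majority = max(set(train_labels), key=train_labels.count)
    return lambda instance: majority

def run_experiment(data, labels, k=10):
    """One planned experiment: k-fold validation, reporting mean accuracy."""
    accuracies = []
    for fold in kfold_indices(len(data), k):
        held_out = set(fold)
        train_labels = [labels[i] for i in range(len(data)) if i not in held_out]
        model = majority_classifier(train_labels)
        correct = sum(model(data[i]) == labels[i] for i in fold)
        accuracies.append(correct / len(fold))
    return mean(accuracies)

# Hypothetical dataset: 100 instances, 70% labelled "yes"
data = [[random.random()] for _ in range(100)]
labels = ["yes"] * 70 + ["no"] * 30
print(round(run_experiment(data, labels, k=10), 2))
```

Swapping in real algorithms (e.g., Weka's RandomForest, J48, or PART) only changes the model-construction step; the plan/run/report/compare structure of the procedure stays the same.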
Cont…
 Simulation Analysis
 A different flavor of laboratory experiment
 Simulation analysis is a descriptive modeling technique.
 As such, simulation analysis does not provide explicit problem
formulation and solution steps
 Consequently, one must specify in detail a procedure for the
development and use of simulation models to assure a successful
outcome from a simulation study.
 While laboratory experiments focus on measuring the impact of
variables on a target issue, simulation measures the
effectiveness of a given model/system in an artificial
environment
 Simulation itself also involves experiments
Cont…
 Elements of Simulation Analysis
 Problem Formulation: the questions for which answers are sought, the
variables involved, and the measures of system performance to be used
 Data Collection: assembling the information necessary to further refine our
understanding of the problem.
 Model Development: building and testing the model of the real system,
selecting a simulation tool (programming language), coding the model,
and debugging it
 Model Verification and Validation: establishing that the model is an
appropriate, accurate representation of the real system.
 Model Experimentation and Optimization: precision issues, such as how large
a sample (simulation time) is necessary to estimate the performance of
the system; the design of effective experiments with which to answer
the questions asked in the problem formulation.
 Implementation of Simulation Results: acceptance of the results by the users
and improved decision making stemming from the analysis.
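These elements can be sketched on a tiny system. The following is a minimal, hypothetical Python sketch; the single-server queue, the exponential arrival/service rates, and the waiting-time performance measure are illustrative assumptions, not taken from the slides:

```python
import random

def simulate_queue(arrival_rate, service_rate, n_customers, seed=1):
    """Single-server FIFO queue: estimate the mean customer waiting time.

    Problem formulation: how long do customers wait on average?
    Model (an assumption): exponential inter-arrival and service times.
    """
    rng = random.Random(seed)   # seeded so runs are reproducible
    arrival = 0.0               # arrival time of the current customer
    server_free_at = 0.0        # time at which the server next becomes idle
    total_wait = 0.0
    for _ in range(n_customers):
        arrival += rng.expovariate(arrival_rate)
        start = max(arrival, server_free_at)   # wait if the server is busy
        total_wait += start - arrival
        server_free_at = start + rng.expovariate(service_rate)
    return total_wait / n_customers

# Experimentation: more customers (longer simulation time) = tighter estimate.
# Validation: for these rates, queueing theory gives a mean wait of 4.
print(round(simulate_queue(arrival_rate=0.8, service_rate=1.0,
                           n_customers=10000), 2))
```

Increasing `n_customers` narrows the estimate (the precision issue raised under experimentation), and comparing the estimate against the known theoretical value for these rates is a simple validation step.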
Cont…
Major Iterative Loops in a Simulation Study

Problem Formulation
  ↓
Data Collection
  ↓
Model Development
  ↓
Model Verification and Validation
  ↓
Model Experimentation and Optimization
  ↓
Implementation of Simulation Results
Validity and Reliability
 Construct Validity
 Concepts being studied are operationalized and measured
correctly
 Internal Validity
 Does the data support the conclusion?
 External Validity
 Establish the domain to which a study’s findings can be
generalized
 Can the result be generalized?
 Experimental Reliability
 Demonstrate that the study can be repeated with the same
results
 Is the method/conduct of the research systematic and logical?
Analysis in Design Research
 Analysis through an iterative design and evaluation process
 Suggest, Design, Construct and Evaluate an artifact as a
search process
 Following standards and principles is important (e.g., OOSE)
 Creativity also has a significant place
 Iterations and improvements have to be reported
 Recall the iterative design cycles
 May need input mainly from qualitative types of analysis, but
from quantitative also
 Mainly, analysis is a deductive process of checking abductively
identified/suggested solutions through an iterative design,
development and evaluation process

[Slide figure: data collection and analysis in the iterative design cycle]
Main perspectives in deductive design
analysis
 Structure of the artifact: the information space the artifact spans
 basis for deducing all required information about the artifact
 determines the configurational characteristics necessary to enable the
evaluation of the artifact
 Evaluation criteria: the dimensions of the information space which
are relevant for determining the utility of the artifact
 can differ depending on the purpose of the evaluation
 Evaluation approach: the procedure for how to practically test an artifact
 defines all roles concerned with the assessment and the way of
handling the evaluation
 the result is a decision whether or not the artifact meets the evaluation
criteria based on the available information
 Can be qualitative or quantitative

Methods

Structure                   Evaluation criteria   Evaluation approach
⚫ process-based meta        ⚫ appropriateness     ⚫ laboratory research
  model                     ⚫ completeness        ⚫ field inquiries
⚫ intended applications     ⚫ consistency         ⚫ surveys
⚫ conditions of                                   ⚫ case studies
  applicability                                   ⚫ action research
⚫ products and results                            ⚫ practice descriptions
  of the method                                   ⚫ interpretative
⚫ reference to constructs                           application research
Models

Structure                   Evaluation criteria   Evaluation approach
⚫ domain                    ⚫ correctness         ⚫ syntactical validation
⚫ scope, purpose            ⚫ completeness        ⚫ integrity checking
⚫ syntax and semantics      ⚫ clarity             ⚫ sampling using selective
⚫ terminology               ⚫ flexibility           matching of data to actual
⚫ intended application      ⚫ simplicity            external phenomena or a
                            ⚫ applicability         trusted surrogate
                            ⚫ implementability    ⚫ integration tests
                                                  ⚫ risk and cost analysis
                                                  ⚫ user surveys
Instantiations

Structure                          Evaluation criteria   Evaluation approach
⚫ executable implementation in     ⚫ functionality       ⚫ code inspection
  a programming language           ⚫ usability           ⚫ testing
⚫ reference to a design model      ⚫ reliability         ⚫ code analysis
⚫ reference to a requirement       ⚫ performance         ⚫ verification
  specification                    ⚫ supportability      ⚫ acceptance study
⚫ reference to the documentation                         ⚫ usability analysis
⚫ reference to quality
  management documents
⚫ reference to configuration
  management documents
⚫ reference to project
  management documents
Validity and reliability in design research
 Pragmatic utility through scientific methods
 Does it solve the initial problem?
 Was it correct and repeatable?
 Is it acceptable to the respective beneficiaries?
 Rigorous evaluation

Interpretation and Discussion
 In a strict sense
 Interpretation is telling what the result means
 Discussion is telling how good/better or related the result is
compared to other works.
 Explain the results in light of
 Previous literature and theories
 Own RQs and/or objectives
 No clear distinction in the case of qualitative research
 Involves reporting results of demonstration and evaluation in
the case of design research
 May require collecting and analyzing data with the same
procedures as we have seen before
Interpreting the Data
 Interpreting the data means several things; in particular,
it means:
 Relating the findings to the original research problem
and to the specific research questions and hypotheses.
 Researchers must eventually come full circle to their
starting point – why they conducted a research study in
the first place and what they hoped to discover – and
relate their results to their initial concerns and questions
Cont…
 Relating the findings to preexisting literature,
concepts, theories, and research studies.
 To be useful, research findings must in some way be
connected to the larger picture – to what people
already know or believe about the topic in question.

 Perhaps the new findings confirm a current
theoretical perspective.
Cont…
 Determining whether the findings have practical
significance as well as statistical significance.
 Statistical significance is one thing; practical
significance – whether findings are actually useful – is
something else altogether.

 Identifying limitations of the study. Finally,
interpreting the data involves outlining the weaknesses
of the study that yielded them.
 No research study can be perfect, and its imperfections
inevitably cast at least a hint of doubt on its findings.
Good researchers know – and they also report – the
weaknesses along with the strengths of their research
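The statistical/practical distinction can be made concrete. A minimal sketch, assuming two hypothetical groups, a normal (z) approximation for the test, and Cohen's conventional 0.2 cutoff for a small effect:

```python
import math
import random
from statistics import mean, stdev

def two_sided_p(z):
    """Two-sided p-value for a z statistic (normal approximation;
    reasonable for large samples)."""
    return math.erfc(abs(z) / math.sqrt(2))

def cohens_d(a, b):
    """Cohen's d: a standardized effect size (practical significance)."""
    pooled = math.sqrt((stdev(a) ** 2 + stdev(b) ** 2) / 2)
    return (mean(a) - mean(b)) / pooled

# Hypothetical groups whose true means differ by only 0.05 standard
# deviations -- a practically negligible difference.
rng = random.Random(0)
a = [rng.gauss(0.05, 1.0) for _ in range(50000)]
b = [rng.gauss(0.00, 1.0) for _ in range(50000)]

se = math.sqrt(stdev(a) ** 2 / len(a) + stdev(b) ** 2 / len(b))
z = (mean(a) - mean(b)) / se
print("statistically significant:", two_sided_p(z) < 0.05)
print("small effect (|d| < 0.2):", abs(cohens_d(a, b)) < 0.2)
```

With samples this large, even a tiny difference yields a very small p-value, while the effect size shows the difference is of little practical use.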
Example: Types of results and their discussion
may take two forms in design science research

TOOLS OF RESEARCH – in relation to data
collection and analysis
 Tools are chosen to facilitate research tasks
 Some researchers need special tools indigenous to a particular discipline
 The concern here is with the general tools of research that the
majority of researchers, regardless of discipline and situation,
typically need to derive meaningful and insightful conclusions from
the data they collect
 Be careful not to equate the tools of research with the methodology of
research
 There are six general tools of research:
1. The library and its resources
2. The computer and its software
3. Techniques of measurement
4. Statistics
5. The human mind
6. Language
1. The Library and Its Resources

 A scholar should know the library's principal resources, understand
its classification system, and find the shortest route to the
information it contains.
 Both physical and online databases
 You learn the library by using the library
 Feel the mood of research in a library
 Libraries have manuals: learn where the various holdings are
located
 The catalogue is the heart of the library – books, films, filmstrips,
tapes, phonograph records, microfilm, maps, pictures, slides, CDs, …
2. The Computer and its Software – Tool of
Research
 Software tools
 Weka, NLTK, ns-3 and other network monitoring tools, etc.
 Be familiar with at least a few selected tools in the area of
research you are engaged in
 Use appropriate tools with logical settings.
 Taking advantage of the Internet
 Web sites: journals, publishers, organizations, individuals, etc.
 Search engines: Google Scholar, Yahoo, AltaVista, etc.
 E-mail (asking questions of authors, experts, etc.; facilitating
collaboration among people; attaching files such as reports)
 News (list servers: e-discussion groups; many groups with
particular interests)
3. Measurement – Tool of Research

 Researchers strive for objectivity: not being influenced by their own
perceptions, impressions, and biases.
 Therefore, they must identify a systematic way of measuring a
phenomenon
 In most cases numerically
 But also qualitatively
 Old proverb – if it exists, then it can be measured
 If it is researchable, then the data must be measurable
4. Statistics as a Tool of Research
 All tools are MORE suitable for some purposes than for others.
 Example: a screwdriver was designed to INSERT and remove screws –
BUT people often use it for punching holes, scratching away paint,
etc. – misuse
 So, too, with statistics – statistics can be a powerful tool when used
correctly (for specific kinds of data and research questions)
 BUT can be misleading when applied in other contexts.
 More useful in some academic disciplines than in others.
 REMEMBER, the statistical values obtained are never the end of research
nor the final answer to the research problem.
 The final question is "What do the data indicate?", not what their
numerical configuration is.
 Statistics give information about the data, BUT a conscientious
researcher is not satisfied until the MEANING of this information is
revealed.
5. The Human Mind – Tool of Research
 Statistics can tell us the center, spread, and relationships of data, BUT
cannot interpret them and arrive at a logical conclusion or meaning.
 Only the mind can do that.
 The mind is the most important tool.
 Nothing equals its powers of comprehension, integrative reasoning
and insight
 Strategies for making use of the human mind to better understand
include:
1. Deductive logic
2. Inductive reasoning
3. Scientific method
4. Critical thinking
5. Collaboration with others
Cont…
 Critical Thinking
 Good researchers engage in critical thinking.
 Involves evaluating information or arguments in terms of
their accuracy and worth
 As an example
 During the literature review, don't just accept research findings
and theories at face value.
 Scrutinize them for faulty assumptions, questionable logic,
weaknesses in methodology, inappropriate statistical analyses,
and unwarranted conclusions.
Cont…
 Critical thinking takes a variety of forms, depending on the
context:
 Verbal reasoning – understanding and evaluating the persuasive
techniques found in oral and written language.
 Argument analysis – discriminating between reasons that do and
do not support a particular conclusion.
 Decision making – identifying and judging several alternatives and
selecting the best one.
 Critical analysis of prior research – evaluating the value of data and
research results in terms of the methods used to obtain them and
their potential relevance to particular conclusions.
Cont…
 Collaboration with Others
 Two heads are better than one.
 A researcher has certain perspectives, assumptions, and
theoretical biases – not to mention holes in knowledge
about the subject matter – that limit the research approaches
of a project.
 Need to bring in colleagues with somewhat different perspectives,
backgrounds, and areas of expertise – more cognitive resources
to tackle the research problem and find meaning.
6. Language as a Tool of Research

 One of humankind's greatest achievements – language facilitates
communication and effective thinking.
 We can think more clearly and efficiently when we can represent
thoughts with specific WORDS and PHRASES
 Good words, even simple ones, can
1. Reduce the world's complexity
2. Facilitate generalization and inference drawing in new
situations
3. Allow abstraction of the environment
4. Enhance the power of thought
Exercise
 Relating research problems, objectives and results
 Problem statements should lead to defining clear
objectives
 Objectives should lead to clear results and interpretations
 Read the following problem descriptions and
1. Craft candidate general objectives for both.
2. Suggest possible/logical results of the studies and the relevant
interpretations expected

Review questions
 Explain the central concept in qualitative analysis
 Describe the difference between survey and laboratory
experiment methods from an IT research perspective
 Explain the essence of analysis in design science research
 What is the purpose of factor analysis?
 Describe the knowledge contribution of typical design science
research
 How do you interpret your research findings?
 Enumerate at least three major challenges in the data collection
phase of a research project.

Make sure that your data
collection and analysis are
logical!!!