Review of Virtual Instrumentation
CONTENTS
Virtual instrumentation model
Graphical system design model
Block Diagram & Architecture of Virtual Instrument
Data-flow techniques
Hardware & software in virtual instrumentation
Virtual instrument and traditional instrument
Comparison with conventional programming
OPC (Object linking and Embedding (OLE) for process control)
HMI/SCADA software
ActiveX programming
Virtual Instrumentation
Virtual instrumentation is an interdisciplinary field that merges sensing, hardware
and software technologies in order to create flexible and sophisticated instruments
for control and monitoring applications.
Virtual instrumentation is the use of customizable software and modular
measurement hardware to create user-defined measurement systems, called virtual
instruments.
Virtual instruments are computer programs that interact with real world objects by
means of sensors and actuators and implement functions of real or imaginary
instruments.
The sensor is usually a simple hardware device that acquires data from the object,
transforms it into electrical signals and transmits them to the computer for further
processing. Simple virtual measuring instruments just acquire and analyse data,
but more complex virtual instruments communicate with objects in both
directions.
Virtual Instrumentation Model
Virtual instrumentation is the combination of user-defined software and modular
hardware that implements custom systems (virtual instruments) with
components for acquisition, processing/analysis and presentation.
Modular Hardware - subdividing the entire hardware design into smaller parts called modules or
skids, that can be independently created and then used in different systems.
Data Acquisition
Data acquisition is the process of gathering or generating information in an
automated fashion from analog and digital measurement sources such as sensors
and devices under test.
A physical input/output signal is typically a voltage or current signal. A voltage
signal is typically in the 0-5 V range, while a current signal is typically a
4-20 mA loop signal.
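As an illustration of the scaling step a DAQ application typically performs on such signals, the following Python sketch maps a 4-20 mA loop current onto an engineering-unit range. The function name and default ranges are illustrative, not taken from any particular driver API.

```python
def current_to_engineering(i_ma, lo_ma=4.0, hi_ma=20.0,
                           eng_min=0.0, eng_max=100.0):
    """Linearly map a 4-20 mA loop current to engineering units.

    4 mA maps to eng_min and 20 mA to eng_max; a reading outside the
    loop range usually indicates a fault (e.g. 0 mA = open loop).
    """
    if not (lo_ma <= i_ma <= hi_ma):
        raise ValueError(f"{i_ma} mA is outside the {lo_ma}-{hi_ma} mA loop")
    return eng_min + (i_ma - lo_ma) / (hi_ma - lo_ma) * (eng_max - eng_min)

# Mid-scale current reads as a mid-scale engineering value:
print(current_to_engineering(12.0))  # 50.0
```

The same linear mapping applies to a 0-5 V voltage input by passing `lo_ma=0.0, hi_ma=5.0`.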
Data Analysis
Raw data may not convey useful information immediately.
Signal processing is frequently needed to transform the signal, remove noise
disturbances, or compensate for environmental effects.
Analysis is a fundamental part of many test, measurement, and control
applications.
Analysing a signal gives you additional insight into what your data means: you
can get a clearer picture of your desired signal or monitor a signal for a particular
behaviour.
Data analysis may include Time Domain Analysis, Frequency (Spectral) Analysis,
Digital Filters, Curve Fitting and Data Modeling, Differential Equations, Linear
Algebra, Nonlinear Systems, Optimization, Root Finding, PID and Fuzzy Control.
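As a minimal sketch of the noise-removal step mentioned above, here is a simple time-domain digital filter in plain Python (no analysis library assumed); the function name is illustrative.

```python
def moving_average(samples, window=3):
    """Smooth a sampled signal with a sliding-window mean: a basic
    FIR low-pass filter that attenuates high-frequency noise."""
    if window < 1 or window > len(samples):
        raise ValueError("window must be between 1 and len(samples)")
    return [sum(samples[i:i + window]) / window
            for i in range(len(samples) - window + 1)]

print(moving_average([1, 2, 3, 4, 5]))  # [2.0, 3.0, 4.0]
```

In practice this is only one of many analysis options; spectral analysis, curve fitting or PID control would follow the same acquire-then-analyse pattern.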
Data Presentation
This stage presents the data in a form suitable for post-analysis. It is also called
the data storage stage; typical output devices include CROs, recorders, plotters
and other displays.
One key element behind the success of the virtual instrumentation approach is
LabVIEW, a software development tool originally developed to support the
requirements of virtual instrumentation.
Researchers can acquire reference data from files or databases and incorporate it
into the model.
Results from the simulation process are saved for post-analysis and visualization
and can be used to introduce changes into the model.
The transition from the prototyping phase to the deployment phase can be very
fast and efficient because the same set of tools used for prototyping can, in most
cases, be applied to the final deployment of the system in the field.
The deployment stage is mostly about hardware: it is where the design is put into
its final, fielded form.
Data-flow Techniques
In a simple Add/Subtract diagram, the block diagram executes from left to right,
not because the objects are placed in that order, but because the Subtract function
cannot execute until the Add function finishes executing and passes its result to
the Subtract function.
Remember that a node executes only when data is available at all of its input
terminals and supplies data to the output terminals only when the node finishes
execution.
Dataflow Example for Multiple Code Segments
Consider which code segment would execute first: the Add, Random Number, or
Divide function.
You cannot know because inputs to the Add and Divide functions are available at
the same time, and the Random Number function has no inputs.
In a situation where one code segment must execute before another, and no data
dependency exists between the functions, use other programming methods, such
as sequence structures or error clusters, to force the order of execution.
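The firing rule described above, that a node executes only when all of its inputs carry data, can be sketched as a tiny dataflow interpreter. The `Node` class and wire names below are invented for illustration and do not correspond to LabVIEW internals.

```python
class Node:
    """A block-diagram node: a function plus the wires it reads."""
    def __init__(self, func, inputs):
        self.func = func
        self.inputs = inputs

def run_dataflow(nodes, values):
    """Fire any node whose input wires all carry data, mirroring the
    rule that a node executes only when every input terminal has data.
    `nodes` maps an output wire name to the Node that drives it."""
    pending = dict(nodes)
    while pending:
        ready = [wire for wire, node in pending.items()
                 if all(i in values for i in node.inputs)]
        if not ready:
            raise RuntimeError("no node is ready: broken or cyclic diagram")
        for wire in ready:
            node = pending.pop(wire)
            values[wire] = node.func(*(values[i] for i in node.inputs))
    return values

# Subtract cannot fire until Add has produced "sum":
result = run_dataflow(
    {"sum": Node(lambda a, b: a + b, ["a", "b"]),
     "diff": Node(lambda s, c: s - c, ["sum", "c"])},
    {"a": 2, "b": 3, "c": 1})
print(result["diff"])  # 4
```

Note that the Subtract node executes second purely because of data availability, not because of its position in the dictionary, which is exactly the dataflow ordering the text describes.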
Wires
You transfer data among block diagram objects through wires. Wires connect the
control and indicator terminals to the Add and Subtract function.
Each wire has a single data source, but you can wire it to many VIs and functions
that read the data.
Wires are different colors, styles, and thicknesses, depending on their data types.
A broken wire appears as a dashed black line with a red X in the middle. Broken
wires occur for a variety of reasons, such as when you try to wire two objects
with incompatible data types.
Common Wire Types
Hardware in Virtual Instrumentation
Input/Output plays a critical role in virtual instrumentation.
To accelerate test, control and design, I/O hardware must be rapidly adaptable to
new concepts and products.
Virtual instrumentation delivers this capability in the form of modularity within
scalable hardware platforms.
Virtual instrumentation is software-based; if we can digitize it, we can measure it.
Standard hardware platforms that house the I/O are important to I/O modularity.
Laptops and desktop computers provide an excellent platform where virtual
instrumentation can make the most of existing standards such as the USB, PCI,
Ethernet, and PCMCIA buses.
Software in Virtual Instrumentation
Software is the most important component of a virtual instrument.
With the right software tool, engineers and scientists can efficiently create their
own applications by designing and integrating the routines that a particular process
requires.
You can also create an appropriate user interface that best suits the purpose of the
application and those who will interact with it.
You can define how and when the application acquires data from the device, how
it processes, manipulates and stores the data, and how the results are presented to
the user.
With powerful software, we can build intelligence and decision-making
capabilities into the instrument so that it adapts when measured signals change
inadvertently or when more or less processing power is required.
An important advantage that software provides is modularity.
When dealing with a large project, engineers and scientists generally approach the
task by breaking it down into functional solvable units.
These subtasks are more manageable and easier to test, given the reduced
dependencies that might cause unexpected behaviour.
We can design a virtual instrument to solve each of these subtasks, and then join
them into a complete system to solve the larger task.
The ease with which we can accomplish this division of tasks depends greatly on
the underlying architecture of the software.
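The decomposition described above can be sketched as independent Python functions composed into one pipeline. The simulated readings and the acquire/analyze/present names are purely illustrative; a real system would call driver APIs inside the acquisition module.

```python
def acquire():
    """Stand-in for a DAQ read; a real module would call driver APIs."""
    return [4.1, 4.9, 5.3, 4.8]

def analyze(samples):
    """Independent, testable analysis module: here, just the mean."""
    return sum(samples) / len(samples)

def present(value):
    """Presentation module: format the result for the user interface."""
    return f"mean = {value:.2f} V"

def instrument():
    """The virtual instrument is simply the composition of the modules,
    so each subtask can be developed and tested in isolation."""
    return present(analyze(acquire()))
```

Because each stage has a narrow interface, any one of them can be replaced, for example swapping the simulated `acquire` for real hardware, without touching the others.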
A virtual instrument is not limited or confined to a stand-alone PC.
In fact, with recent developments in networking technologies and the Internet, it is
more common for instruments to use the power of connectivity for the purpose of
task sharing.
Typical examples include supercomputers, distributed monitoring and control
devices, as well as data or result visualization from multiple locations.
Every virtual instrument is built upon flexible, powerful software by an innovative
engineer or scientist applying domain expertise to customize the measurement and
control application.
The result is a user-defined instrument specific to the application needs.
Virtual instrumentation software can be divided into several layers: application
software, test and data management software, and measurement and control
services software.
Layers of virtual instrumentation software
Most people think immediately of the application software layer. This is the
primary development environment for building an application.
It includes software such as LabVIEW, LabWindows/CVI (ANSI C),
Measurement Studio (Visual Studio programming languages), Signal Express and
VI Logger.
Above the application software layer is the test executive and data management
software layer. This layer of software incorporates all of the functionality
developed by the application layer and provides system-wide data management.
Measurement and control services software is equivalent to the I/O driver software
layer. It is one of the most crucial elements of rapid application development.
This software connects the virtual instrumentation software and the hardware for
measurement and control.
It includes intuitive application programming interfaces, instrument drivers,
configuration tools, I/O assistants and other software included with the purchase
of hardware.
This software offers optimized integration with both hardware and application
development environments.
Virtual Instrument and Traditional Instrument
A traditional instrument is designed to collect data from an environment, or from a
unit under test, and to display information to a user based on the collected data.
Ex: Oscilloscopes, spectrum analyzers and digital multimeters.
A virtual instrument (VI) is defined as an industry-standard computer equipped
with user friendly application software, cost-effective hardware and driver
software that together perform the functions of traditional instruments. Simulated
physical instruments are called virtual instruments (VIs).
With virtual instrumentation, engineers and scientists reduce development time,
design higher quality products and lower their design costs.
Virtual Instrumentation is flexible. Virtual instruments are defined by the user
while traditional instruments have fixed vendor-defined functionality.
The associations within a virtual instrument are not fixed but rather managed by
software.
Every virtual instrument consists of two parts: software and hardware.
A virtual instrument typically costs about the same as, and often much less than,
a comparable traditional instrument for the same measurement task.
A traditional instrument provides all the software and measurement circuitry
packaged into a product with a finite list of fixed functions accessed through the
instrument front panel.
A virtual instrument provides all the software and hardware needed to accomplish
the measurement or control task.
In addition, with a virtual instrument, engineers and scientists can customize the
acquisition, analysis, storage, sharing and presentation functionality using
productive, powerful software.
Without the displays, knobs and switches of conventional, external box-based
instrumentation products, a virtual instrument uses a personal computer for all
user interaction and control.
The cost to configure a virtual instrumentation-based system using a data
acquisition board or cards can be as little as 25% of the cost of a conventional
instrument.
Stand-alone traditional instruments such as oscilloscopes and waveform
generators are very powerful, expensive, and designed to perform one or more
specific tasks defined by the vendor.
However, the user generally cannot extend or customize them.
The knobs and buttons on the instrument, the built-in circuitry, and the functions
available to the user, are all specific to the nature of the instrument.
In addition, special technology and costly components must be developed to build
these instruments, making them very expensive and slow to adapt.
Traditional instruments also frequently lack portability, whereas virtual
instruments running on notebooks automatically incorporate their portable nature.
Traditional Instruments and Software-Based Virtual Instruments
Traditional instruments and software-based virtual instruments largely share the
same architectural components but embody radically different philosophies.
A traditional instrument might contain an integrated circuit to perform a particular
set of data processing functions.
In a virtual instrument, these functions would be performed by software running
on the PC processor. We can extend the set of functions easily, limited only by the
power of the software used.
Both require one or more microprocessors, communication ports (for example,
serial and GPIB), and display capabilities, as well as data acquisition modules.
By employing virtual instrumentation solutions, you can lower capital costs,
system development costs, and system maintenance costs, while improving time to
market and the quality of your own products.
There is a wide variety of available hardware that you can either plug into the
computer or access through a network. These devices offer a wide range of data
acquisition capabilities at a significantly lower cost than that of dedicated devices.
As integrated circuit technology advances and off-the-shelf components become
cheaper and more powerful, so do the boards that use them. With these advances
in technology come increases in data acquisition rates, measurement accuracy and
precision, and better signal isolation.
Traditional Vs Virtual Instruments
Traditional Instruments                                  | Virtual Instruments
Vendor-defined                                           | User-defined
Function-specific, stand-alone with limited connectivity | Application-oriented system with connectivity to networks, peripherals and applications
Hardware is the key                                      | Software is the key
Expensive                                                | Low-cost, reusable
Closed, fixed functionality                              | Open, flexible functionality leveraging familiar computer technology
Slow turn on technology (5-10 year life cycle)           | Fast turn on technology (1-2 year life cycle)
Minimal economies of scale                               | Maximum economies of scale
High development and maintenance costs                   | Software minimizes development and maintenance costs
Comparison of Text-Based and Graphical Programming
Text-Based Programming                                       | Graphical Programming
Syntax must be known to program                              | Knowledge of syntax is helpful but not required
Program execution is from top to bottom                      | Program execution is from left to right
The program must be compiled or executed to check for errors | Errors are indicated as the blocks are wired
Front panel design needs extra coding or extra work          | Front panel design is part of the programming
Not interactive                                              | Highly interactive
Conventional, instruction-based programming                  | Data-flow programming
Finding logical errors in large programs is easy             | Finding logical errors in large programs is quite complicated
Program flow is not visible                                  | Data flow is visible
Text-based programming                                       | Icon-based programming and wiring
Passing parameters to a subroutine is difficult              | Passing parameters to a subVI is easy
OPC (Object Linking and Embedding (OLE) for Process Control)
OLE for Process Control (OPC) is the original name for a standard specification
developed in 1996.
The standard specifies the communication of real-time plant data between control devices
from different manufacturers.
The OPC Specification was based on the OLE, COM, and DCOM technologies developed
by Microsoft for the Microsoft Windows operating system family.
The evolution of OLE started, in 1990, on the top of Dynamic Data Exchange (DDE)
concept of Microsoft, and was later reimplemented with Microsoft Component Object
Model (COM) and then Distributed COM (DCOM) as its bases, and eventually led to
ActiveX controls.
Industrial automation systems require open standards which can be adopted by any
provider of control systems, instrumentation, and process control for multi-vendor
interoperability.
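The vendor-neutral idea behind OPC, that any client can read and write named items exposed by any compliant server, can be illustrated with a toy in-memory stand-in. This is a conceptual sketch only, not a real OPC (classic or UA) client; the class name and tag names are invented.

```python
class TagServer:
    """Toy in-memory stand-in for an OPC server's address space.

    Real OPC clients browse and read named items exposed by any
    vendor's compliant server; this sketch only mimics that
    read/write-by-name interface."""
    def __init__(self):
        self._tags = {}

    def write(self, name, value):
        self._tags[name] = value

    def read(self, name):
        return self._tags[name]

# A PLC-side writer and an HMI-side reader agree only on the tag
# name, not on any vendor-specific protocol or wire format:
server = TagServer()
server.write("Boiler1.Temperature", 98.6)
print(server.read("Boiler1.Temperature"))  # 98.6
```

In a real deployment the transport underneath this interface would be COM/DCOM (OPC classic) or, today, OPC UA.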
OPC stands for OLE for Process Control and is based on the core OLE
technologies COM and DCOM.