Java Multi Threading
Process vs Thread
A process is an executing instance of a program, while a thread is a
single sequential flow of control within a process. A process
can contain multiple threads. A thread is sometimes
called a lightweight process.
A JVM runs in a single process, and threads in a JVM
share the heap belonging to that process. That is why
several threads may access the same object. Threads
share the heap but have their own stack space. This is
how one thread's method invocations and local
variables are kept safe from other threads. The heap,
however, is not thread-safe, and access to shared
objects must be synchronized.
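A runnable sketch of this sharing (class and variable names are illustrative): both threads append to the same heap object, while each keeps its own copy of a local variable on its private stack.

```java
public class HeapVsStack {
    // One object on the shared heap, visible to both threads.
    static final StringBuffer shared = new StringBuffer();

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            // 'local' lives on each thread's own stack and is never shared.
            int local = 0;
            for (int i = 0; i < 1000; i++) {
                local++;
            }
            // StringBuffer methods are synchronized, so this append is safe.
            shared.append(local + " ");
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println(shared); // prints "1000 1000 "
    }
}
```

Because StringBuffer's append is synchronized, the result is always "1000 1000 "; an unsynchronized StringBuilder would carry no such guarantee.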
Thread states
A Java thread is always in one of six states: NEW, RUNNABLE, BLOCKED,
WAITING, TIMED_WAITING, or TERMINATED.
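A small sketch of observing these states via Thread.getState() (timings are illustrative):

```java
public class ThreadStates {
    public static void main(String[] args) throws InterruptedException {
        Thread t = new Thread(() -> {
            try {
                Thread.sleep(200);          // TIMED_WAITING while sleeping
            } catch (InterruptedException ignored) { }
        });
        System.out.println(t.getState());   // NEW: created but not started
        t.start();
        Thread.sleep(50);                   // give t time to reach sleep()
        System.out.println(t.getState());   // usually TIMED_WAITING
        t.join();
        System.out.println(t.getState());   // TERMINATED
    }
}
```

The middle observation is timing-dependent: if the main thread checks before t reaches sleep(), it may still see RUNNABLE.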
Synchronization
Method level
class MethodLevel {
    // shared among threads
    SharedResource x, y;

    // the lock is the MethodLevel instance itself
    public synchronized void method1() {
        // work with x and y; only one thread at a time can be in here
    }
} // end of class

Block level
class BlockLevel {
    // shared among threads
    SharedResource x, y;
    // dummy objects for locking
    Object xLock = new Object(), yLock = new Object();

    public void method1() {
        synchronized (xLock) {
            // work with x
        }
        // code here runs without holding either lock
        synchronized (yLock) {
            // work with y
        }
    } // end of method1()
} // end of class
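To see why unsynchronized heap access is unsafe, a runnable sketch of block-level locking (class and field names are illustrative): two threads increment a shared counter inside a synchronized block, so no updates are lost.

```java
public class SyncDemo {
    static int counter = 0;                 // shared, lives on the heap
    static final Object lock = new Object(); // dummy object for locking

    public static void main(String[] args) throws InterruptedException {
        Runnable task = () -> {
            for (int i = 0; i < 100_000; i++) {
                synchronized (lock) {       // without this block, updates can be lost
                    counter++;              // read-modify-write is not atomic
                }
            }
        };
        Thread t1 = new Thread(task);
        Thread t2 = new Thread(task);
        t1.start();
        t2.start();
        t1.join();
        t2.join();
        System.out.println(counter);        // always 200000 with the lock
    }
}
```

Deleting the synchronized block typically yields a final count below 200000, because the two threads' read-modify-write sequences interleave.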
Thread class
The class java.lang.Thread has the following constructors:
public Thread();
public Thread(String threadName);
public Thread(Runnable target);
public Thread(Runnable target, String threadName);
The method run() specifies the running behavior of the
thread.
You do not invoke the run() method explicitly. Instead,
you call the start() method of the class Thread.
If a thread is constructed by extending the Thread class,
the method start() will call back the overridden run()
method in the extended class.
On the other hand, if a thread is constructed by
providing a Runnable object to the Thread's constructor,
the start() method will call back the run() method of
that Runnable object.
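Both construction styles side by side (class and thread names are illustrative); in each case start() is called, never run() directly:

```java
public class StartDemo {
    // Way 1: extend Thread and override run()
    static class Worker extends Thread {
        @Override
        public void run() {
            System.out.println("extended Thread running");
        }
    }

    public static void main(String[] args) throws InterruptedException {
        Thread t1 = new Worker();

        // Way 2: pass a Runnable to the Thread constructor
        Thread t2 = new Thread(() -> System.out.println("runnable running"),
                               "worker-2");

        t1.start(); // JVM calls back Worker.run()
        t2.start(); // JVM calls back the Runnable's run()
        t1.join();
        t2.join();
    }
}
```

Calling run() directly would execute it on the current thread; only start() creates a new thread of execution.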
Method         Returns  Description
currentThread  Thread   Returns a reference to the currently executing thread (static).
getName        String   Returns this thread's name.
getPriority    int      Returns this thread's priority.
isAlive        boolean  Tests whether this thread has started and has not yet died.
run            void     This method is the entry point for threads, like the main method for applications.
start          void     Causes this thread to begin execution; the JVM calls back run().
sleep          void     Pauses the currently executing thread for a specified time (static).
wait           void     Places the current thread in the waiting state until another thread calls notify() (defined in Object).
notify         void     Removes a thread from the waiting state and places it in the ready-to-run state (defined in Object).
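A minimal wait()/notify() hand-off (names are illustrative): the waiter releases the lock inside wait(), and notify() moves it back to the ready-to-run state.

```java
public class WaitNotifyDemo {
    static final Object lock = new Object();
    static boolean ready = false;

    public static void main(String[] args) throws InterruptedException {
        Thread waiter = new Thread(() -> {
            synchronized (lock) {
                while (!ready) {            // guard against spurious wakeups
                    try {
                        lock.wait();        // releases the lock while waiting
                    } catch (InterruptedException e) {
                        return;
                    }
                }
                System.out.println("woken up");
            }
        });
        waiter.start();
        Thread.sleep(100);                  // let the waiter reach wait()
        synchronized (lock) {
            ready = true;
            lock.notify();                  // move the waiter to ready-to-run
        }
        waiter.join();
    }
}
```

Both wait() and notify() must be called while holding the lock on the same object, otherwise the JVM throws IllegalMonitorStateException.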
Multitasking
On systems that have multiple hardware CPUs, mostly server systems, each CPU can
run a different thread. If you have multiple tasks, independent of one another, they
will complete execution in a shorter period of time since they execute simultaneously.
This can be used to great advantage on server systems since incoming client
requests are essentially independent of one another. Server applications are
intrinsically multi-threaded.
In the situation of a single CPU, threads are actually not executed simultaneously (in
parallel). Instead, the operating system manages the multiple flows of execution by
repeatedly stopping one and starting another. In other words, simultaneous execution
is actually an illusion on a single-CPU system.
You may be wondering by now why it would be beneficial to simulate simultaneous
execution on a single CPU machine. Would it not introduce overheads in switching
between the different tasks, and actually take more time to complete all the tasks?
Concurrency vs Parallelism
Concurrent multithreading systems give the appearance
of several tasks executing at once, but these tasks are
actually split up into chunks that share the processor
with chunks from other tasks. In parallel systems, two
tasks are actually performed simultaneously. Parallelism
requires a multi-CPU system.
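To check how much true parallelism a machine can offer, the JVM reports the number of logical processors via the standard java.lang.Runtime API:

```java
public class CpuCount {
    public static void main(String[] args) {
        // Logical CPUs available to the JVM; running more runnable threads
        // than this degenerates into time-sliced concurrency.
        int cpus = Runtime.getRuntime().availableProcessors();
        System.out.println("CPUs: " + cpus);
    }
}
```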