Takehiro Tsuruta, Kazuyuki Miura, Mikita Miyaguchi
Automation in Construction
journal homepage: www.elsevier.com/locate/autcon
Keywords: Marking; Mobile robot; Omnidirectional vehicle; Tracking; Free access floor; Construction site

Improving the productivity of construction work is an urgent task, because the shortage of construction workers in Japan has become a major social issue. Marking work for building materials, which is currently performed manually, is an indispensable part of the construction process. Its automation would allow construction workers to focus on the installation step and thus significantly boost their productivity. In this study, an automated mobile robotic system for marking free access floors has been developed. Building such floors requires drawing grid-pattern lines whose intersections (representing the positions of future pedestal bases) can be automatically marked by the proposed system, which consists of a mobile robot with a marking device at its center and a laser positioning unit. Cross marks are drawn on the floor by controlling the marking device, while the laser positioning unit accurately moves a line laser along a designated direction and measures the distance to a projected object. The marking robot tracks the line laser and arrives at a designated point. The deviation between the designated position and the robot center is calculated, and the cross mark is drawn exactly at the former location. The marking performance of the developed robotic system has been evaluated by conducting experiments inside a 6 m × 6.5 m area at a construction site. The average deviation between the marked and designated positions is 2.3 mm, and 77% of all deviations are smaller than 3 mm.
1. Introduction

The shortage of construction workers in Japan has become a major social issue (their number in 2015 was approximately 73% of the peak number recorded in 1997). Therefore, improving the productivity of construction work is a pressing issue. Marking work is indispensable in the construction process. Before installing building components (such as dry walls, free access floors, and anchor bolts), construction workers must first draw marks and lines on the floors, walls, ceilings, and other places at the construction site, which indicate their future positions. During the construction of big buildings, more than a thousand such marks and lines must be drawn manually, which calls for high efficiency and a high level of automation.

The progress achieved by surveying technology in recent years is remarkable. Measuring instruments such as the laser scanner and the total station are widely used at construction sites. Prior studies focused on improving the efficiency of the marking work with these instruments can be classified into support technologies aimed at increasing the effectiveness of the manual marking process [1–3] and automated marking robots [4–11]. If the marking work is fully automated using these robots, construction workers will be able to concentrate on installing building parts, thus increasing their productivity.

In this study, a fully automated mobile robotic system for marking free access floors is proposed. To build such floors correctly, grid-pattern lines should be drawn on the floor surface before installation. The intersections of these lines, automatically drawn by the developed marking robot (guided to target positions by a laser marker), denote the positions for installing floor pedestal bases. The three-dimensional (3D) measuring instrument determines the robot position accurately, which is subsequently used to draw cross marks. After that, the robot moves to the next target position and repeats the entire operation. Because the marking process is fully automated, providing all coordinates of the target positions to this system is sufficient for its successful completion.

2. Related literature studies

2.1. Robot navigation

When an automated construction robot performs a task at a construction site, it has to move to its working location. After completion, it either returns or moves to the next job. Therefore, proper navigation of the construction robot within the working area is extremely important. Odometry (dead reckoning) is one of the best approaches to achieve this goal in terms of practicality, price, and performance. However, this method accumulates positioning errors as the travel distance increases.
⁎ Corresponding author.
E-mail addresses: tsuruta.takehiro@takenaka.co.jp (T. Tsuruta), miura.kazuyuki@takenaka.co.jp (K. Miura), miyaguchi.mikita@takenaka.co.jp (M. Miyaguchi).
https://doi.org/10.1016/j.autcon.2019.102912
Received 13 January 2019; Received in revised form 18 June 2019; Accepted 18 July 2019
Available online 01 August 2019
0926-5805/ © 2019 The Authors. Published by Elsevier B.V. This is an open access article under the CC BY-NC-ND license
(http://creativecommons.org/licenses/BY-NC-ND/4.0/).
T. Tsuruta, et al. Automation in Construction 107 (2019) 102912
Several studies on automated mobile marking robots have been performed previously. The system constructed by Tanaka et al. was able to autonomously move to a target point and draw specified figures on a ceiling board [4]. Its accuracy was 10 mm, and the time required for drawing a single point was about 8 min. The robot had to detect the edges of three pillars at a construction site by using a laser range finder (LRF); if the distance from the robot to a pillar was too large, or the pillar was covered with a fire resistant material, its marking accuracy was low. Jensfelt et al. [5] developed a mobile robotic system for automatically marking the positions of stands for a trade fair or exhibition. Their system used the information obtained from the existing computer aided design (CAD) model of the environment and a two-dimensional laser scanner to localize the robot, and it was able to mark 492 points in an area of 10,150 m2. The total time required for performing this task was about 4.5 h. The average absolute error was 28 mm, and the system performance in terms of the marking area and speed was excellent; however, its accuracy was inadequate for some types of layout marking at construction sites. Abidin et al. [6] also developed an autonomous mobile robotic system to perform floor marking for a trade fair or exhibition, which was based on the image captured by a camera mounted above the area to be marked. The marking accuracy of this system was about 10 mm. However, its camera had to be installed on the ceiling and calibrated depending on the installation position, which was very laborious and costly. Moreover, the marking area in that case was limited by the visual field of the camera. Inoue et al. reported a system using a total station and an LRF to determine the robot position [7–9]. Its marking accuracy was within about 2 mm, and the time required for marking after the robot reached the target area was about 2 min. In this system, the attitude angle of the mobile robot could not be measured during movement, and no navigation method for moving from one mark point to another was reported. The "Laybot" developed by DPR Construction [10] was capable of drawing lines on slabs for drywall layouts at a speed of 300 to 400 ft per hour while running; its marking accuracy was within an eighth of an inch. Kitahara et al. reported a robotic system that could draw line segments on the floors and walls of construction sites with an accuracy of 1 mm or smaller [11]. However, their system was not automated and required an operator.

3. Composition of the marking robotic system

3.1. General outline

In the proposed system, composed of a laser positioning unit (LPU) and a marking robot (MR), target marking positions are assumed to be set in advance on the floor plan (its general outline is shown in Fig. 1). The marking flow is divided into the MR guidance phase and the cross-mark drawing phase. In the former phase, the MR roughly moves toward the target position, while in the latter phase, the MR position is determined with high accuracy, and a cross mark is drawn on the floor. These two steps are repeated automatically.

In the MR guidance phase, the MR travels from the current position, A(ra, θa), to the next designated position, B(rd, θd), set in advance. First, the LPU directs the measuring laser parallel to the floor toward position B. Second, the LPU turns the line laser to angle θd. The MR base includes several cameras and screens; the cameras capture the screen image, onto which the laser spot and line are projected. The MR tracks the moving laser line and travels along the circumference of the circle having the LPU at its center while maintaining the screen orientation perpendicular to the direction of the line laser. When the angle of the line laser reaches θd, the LPU halts the line laser movement. Subsequently, the MR moves backward or forward along the straight line connecting the LPU and position B while maintaining the screen orientation perpendicular to the line laser. When the difference between the measured distance and the designated distance rd becomes smaller than the threshold, the MR stops. The MR arrival position does not exactly correspond to the designated position, and the position of the MR center is calculated from the combination of the LPU measurement with the screen image.

In the cross-mark drawing phase, the MR draws a cross mark exactly at the designated position using a special marking device that can move the marking pen in the x, y, and vertical directions. The movement value of this pen is calculated by comparing position B with the current position of the MR center. By controlling the marking device based on the calculated movement value, a cross mark with its center corresponding to position B is drawn on the floor.
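The two-phase guidance flow described in Section 3.1 can be sketched as a waypoint generator: phase 1 sweeps the robot along the circle of radius ra centered on the LPU until its bearing reaches θd, and phase 2 moves it radially until the distance equals rd. The function below is an illustrative reconstruction in the LPU frame; the function name and step sizes are our own assumptions, not from the paper.

```python
import math

def guidance_waypoints(r_a, theta_a, r_d, theta_d, ang_step=0.01, rad_step=5.0):
    """Generate (x, y) waypoints in the LPU frame for the two-phase guidance.

    Phase 1: travel along the circle of radius r_a (LPU at the origin)
             from bearing theta_a to the designated bearing theta_d.
    Phase 2: travel radially along the line LPU -> B until the distance
             equals the designated distance r_d.
    Angles in radians; distances in millimetres.
    """
    points = []
    # Phase 1: circular arc at constant radius r_a.
    n = max(1, int(abs(theta_d - theta_a) / ang_step))
    for i in range(n + 1):
        th = theta_a + (theta_d - theta_a) * i / n
        points.append((r_a * math.cos(th), r_a * math.sin(th)))
    # Phase 2: straight radial motion from r_a to r_d at bearing theta_d.
    m = max(1, int(abs(r_d - r_a) / rad_step))
    for i in range(1, m + 1):
        r = r_a + (r_d - r_a) * i / m
        points.append((r * math.cos(theta_d), r * math.sin(theta_d)))
    return points
```

The last waypoint coincides with the designated position B; in the real system the stop is triggered by the measured-distance threshold rather than a precomputed path.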
Fig. 2. A kinematic model of the omnidirectional vehicle.

3.2. Marking robot

In the developed system, an omnidirectional vehicle (ODV) is used as the movement vehicle. It has three omnidirectional wheels mounted symmetrically at angles of 120 degrees. Each wheel is driven by a DC motor and located at the same distance from the robot center. Unlike non-holonomic robots, this type of vehicle has full mobility in the floor plane, meaning that at each instant it can move in any direction without reorientation [23]. The rotational speed of each wheel can be controlled independently by sending commands to the motor driver. The kinematic model of the MR is shown in Fig. 2. For convenience of description, three coordinate systems are used in this work: ΣW, the world coordinate system; ΣR, the robot coordinate system; and ΣL, the laser coordinate system, in which the Yl axis corresponds to the direction of the line laser and the origins Ow and Ol are identical. Here ωi (i = 1, 2, 3) denotes the rotational velocities of the wheels (i represents the wheel number), Rw is the radius of the wheel, and L is the distance from the robot center to the wheel. The parameters xl, yl, and θl represent the position and attitude angle in ΣL. The inverse-kinematic model in ΣL is described by Eq. (1) [24], where the dots over xl, yl, and θl indicate derivatives of these variables with respect to time. This equation is used to calculate ωi (i = 1, 2, 3) for the MR motion control as described in Section 4.3.

  \begin{pmatrix} \omega_1 \\ \omega_2 \\ \omega_3 \end{pmatrix}
  = \frac{1}{R_w}
  \begin{pmatrix}
    \cos\theta_l & \sin\theta_l & L \\
    \cos(\theta_l + 2\pi/3) & \sin(\theta_l + 2\pi/3) & L \\
    \cos(\theta_l + 4\pi/3) & \sin(\theta_l + 4\pi/3) & L
  \end{pmatrix}
  \begin{pmatrix} \dot{x}_l \\ \dot{y}_l \\ \dot{\theta}_l \end{pmatrix}    (1)

A photograph of the MR is shown in Fig. 3. Two screens and cameras are placed on the metal frames installed on the ODV base. Screens 1 and 2 are oriented parallel to each other and perpendicular to the floor; the distance between them is ys. Cameras 1 and 2 face screens 1 and 2, respectively, and capture the corresponding images of the projected laser spot and line. The marking device is installed at the robot center; it consists of the XY movement stage and the pen unit, which can move the marking pen in the vertical direction. The position of the pen unit can be adjusted using the slider rails. By controlling the marking device, the cross mark is drawn on the floor.

3.3. Laser positioning unit

This unit is composed of a guide laser unit (GLU) and a 3D measuring instrument (3D-MI).

Fig. 5 summarizes the relationships between the hardware modules of the MR and LPU components. The software used for their communication and integration is based on the Robot Operating System (ROS) [25]. Computer A of the MR conducts three main tasks. First, it performs image processing to detect the laser position projected on the screen. Second, it sends commands to the motor drivers for the ODV and the XY movement stage. Third, it exchanges various parameters (such as the measured distance, the rotation angle of the 3D-MI, and the target position) with computer B, which controls the LPU, through a Wi-Fi connection. The ODV motor driver controls the motors for all omni-wheels, and the motor driver for the XY movement stage controls the motors for the X, Y, and Z axes. Computer B controls both the 3D-MI and the rotation stage; it sends commands such as those directing the measuring laser along a designated direction and measuring the distance, and it also sends commands specifying the rotation angle and speed of the rotation stage.

4. Controlling of the MR and the LPU components

4.1. Measuring the MR position

The MR position is calculated from the positions of the laser line and laser spot projected onto the screens and from the distance measured by the LPU. As shown in Fig. 6, the attitude angle θl in ΣL is computed using Eq. (2), in which d1 and d2 are the x-coordinates of the laser line projected on each screen in ΣR:

  \tan\theta_l = \frac{d_2 - d_1}{y_s}    (2)

In the cross-mark drawing phase, a laser spot is projected on screen 1. The variables Δxr and Δyr, which represent the movement values of the pen unit in the x- and y-axis directions in ΣR, are calculated via Eqs. (3) and (4), respectively, using the geometric relationship depicted in Fig. 7. Here, dspot is the x-coordinate of the laser spot in ΣR, rscreen is the distance from the LPU to screen 1, and θs is the attitude angle of the MR with respect to the direction of the measuring laser.

  \Delta y_r = (r_{\mathrm{screen}} - r_d)\cos\theta_s    (4)

The laser spot does not necessarily overlap with the laser line because of the rotation stage control error. In the proposed system, a servomotor is used for the LPU rotation stage; according to its technical specifications, the backlash is smaller than 0.5 degrees. θl is used instead of θs in Eqs. (3) and (4). The MR is programmed to stop when rscreen − rd < 30 mm. Thus, the maximum error of Δxr and Δyr due to the usage of θl instead of θs, estimated at θs = 0 and 90 degrees, is 0.26 mm. By controlling the marking device based on the values calculated by Eqs. (3) and (4), a cross mark is drawn at the designated position with high accuracy.
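Eq. (1) maps the desired body velocities (ẋl, ẏl, θ̇l) to the three wheel speeds. A direct transcription in Python follows; the numeric values of Rw and L are placeholders, not the paper's hardware dimensions.

```python
import math

def wheel_speeds(vx, vy, omega, theta_l, R_w=0.05, L=0.2):
    """Inverse kinematics of the three-wheel omnidirectional vehicle, Eq. (1).

    vx, vy  : translational speeds of the MR in the laser frame [m/s]
    omega   : rotational speed [rad/s]
    theta_l : attitude angle of the MR in the laser frame [rad]
    R_w, L  : wheel radius and centre-to-wheel distance [m] (placeholders)
    Returns (w1, w2, w3), the wheel rotational velocities [rad/s].
    """
    return tuple(
        (math.cos(theta_l + k * 2 * math.pi / 3) * vx
         + math.sin(theta_l + k * 2 * math.pi / 3) * vy
         + L * omega) / R_w
        for k in range(3)
    )
```

A quick sanity check of the matrix structure: for pure rotation (vx = vy = 0) all three wheels must turn at the same speed Lω/Rw, regardless of θl.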
Fig. 5. A diagram describing different modules of the LPU and the MR component.
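The position measurement of Section 4.1 reduces to two small formulas: the attitude angle from the laser-line abscissae on the two screens (Eq. (2)) and the radial pen correction from the distance surplus (Eq. (4)). A minimal sketch of the two recoverable equations (Eq. (3) for Δxr is not legible in this copy, so it is omitted):

```python
import math

def attitude_angle(d1, d2, y_s):
    """Eq. (2): tan(theta_l) = (d2 - d1) / y_s, where d1 and d2 are the
    x-coordinates of the laser line on screens 1 and 2, and y_s is the
    distance between the screens."""
    return math.atan2(d2 - d1, y_s)

def radial_offset(r_screen, r_d, theta_s):
    """Eq. (4): Delta_y_r = (r_screen - r_d) * cos(theta_s). The paper
    substitutes theta_l for theta_s, with a worst-case error of 0.26 mm
    given the 30 mm stopping threshold."""
    return (r_screen - r_d) * math.cos(theta_s)
```

With d1 = d2 the robot screens are perpendicular to the laser and θl = 0, in which case the radial offset is simply the distance surplus rscreen − rd.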
Fig. 7. Screen layout and laser spot projected on the screen.

4.2. Detection of the laser line and laser spot

It is difficult to determine the laser position accurately from a captured image without preprocessing, because the captured image has two types of distortion that must be removed: lens distortion and perspective distortion. The former can be eliminated by using Zhang's method [26]; Fig. 8a and b show the screen images captured before and after the lens distortion removal. The perspective distortion is removed by the homographic transformation based on the following formulas:

  u_{i,j} = \frac{a + b s_{i,j} + c t_{i,j}}{1 + g s_{i,j} + h t_{i,j}}, \qquad
  v_{i,j} = \frac{d + e s_{i,j} + f t_{i,j}}{1 + g s_{i,j} + h t_{i,j}}    (5)

where u_{i,j} and v_{i,j} are the camera coordinates after the transformation, and s_{i,j} and t_{i,j} are the camera coordinates before the transformation (i is the screen number and j is the index of the known point).

Fig. 8. Images obtained before and after distortion removal (c: image after the homographic transformation).

The x-coordinate of the laser line in ΣR is obtained from the processed image via Eq. (6):

  d_j = \left( \frac{u_{\max,j}}{N_H} - \frac{1}{2} \right) W    (6)

where umax,j (j is the screen number) is the horizontal camera coordinate with the largest pixel intensity, NH is the number of horizontal pixels, and W is the screen width (see Fig. 9c).

The laser spot is red (see Fig. 10a). A binarized image is obtained from the red channel image using a binarization threshold, whose value is set to 250 in this study. The noise level in the binarized image is reduced by three iterations of erosion processing with a 1 × 3 kernel. The parameters of the image processing are empirically determined and used in all experiments performed in this work. As shown in Fig. 10c, the laser line is removed after image processing, leaving only the laser spot intact. From this image, the area centroid pixel of the detected spot is determined. After that, as shown in Fig. 10d, a masked image is obtained by setting the pixel intensity to zero everywhere except in the square region of 50 × 50 pixels centered on the area centroid pixel in the red channel image. The horizontal camera coordinate of the laser spot, umax,spot, is calculated as the brightness center of gravity in this masked image; the x-coordinate of the laser spot in ΣR, dspot, is then determined by Eq. (6). The image processing procedure described above is implemented with OpenCV and conducted by the computer installed on the MR. The calculation cycle depends on the computer performance; in the proposed system, it is approximately 0.17 s.

The rotational speed of each wheel is determined by Eq. (1).

Fig. 10. Laser spot images obtained before and after image processing.
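Eq. (5) is the standard eight-parameter homography, so its coefficients a–h can be recovered from four known point correspondences by solving a linear system. A self-contained sketch (NumPy; coefficient naming as in Eq. (5), everything else is an illustrative assumption):

```python
import numpy as np

def fit_homography(src, dst):
    """Solve for (a..h) in Eq. (5) from four (s, t) -> (u, v) correspondences.

    u = (a + b*s + c*t) / (1 + g*s + h*t)
    v = (d + e*s + f*t) / (1 + g*s + h*t)
    Cross-multiplying gives two linear equations per point, an 8x8 system.
    """
    A, rhs = [], []
    for (s, t), (u, v) in zip(src, dst):
        A.append([1, s, t, 0, 0, 0, -s * u, -t * u]); rhs.append(u)
        A.append([0, 0, 0, 1, s, t, -s * v, -t * v]); rhs.append(v)
    return np.linalg.solve(np.array(A, float), np.array(rhs, float))

def apply_homography(coeff, s, t):
    """Map a point through Eq. (5) with fitted coefficients."""
    a, b, c, d, e, f, g, h = coeff
    w = 1 + g * s + h * t
    return (a + b * s + c * t) / w, (d + e * s + f * t) / w
```

In the paper's setting the four known points would be the screen corners detected in the camera image, mapped to their metric positions on the physical screen.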
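The spot-detection steps of Section 4.2 (red-channel threshold at 250, three erosions with a 1 × 3 kernel, area centroid, then a brightness centre of gravity inside a 50 × 50 window) can be prototyped without OpenCV. The NumPy sketch below mirrors the described pipeline; the paper's implementation uses OpenCV, so treat this as an illustrative re-creation rather than the authors' code.

```python
import numpy as np

def erode_1x3(binary, iterations=3):
    """Binary erosion with a horizontal 1x3 kernel, as described in the paper."""
    out = binary.astype(bool)
    for _ in range(iterations):
        left = np.roll(out, 1, axis=1); left[:, 0] = False    # pixel j-1
        right = np.roll(out, -1, axis=1); right[:, -1] = False  # pixel j+1
        out = out & left & right
    return out

def detect_spot(red, threshold=250, win=50):
    """Return the brightness centre of gravity (u, v) of the laser spot.

    red: 2-D uint8 red-channel image.
    """
    mask = erode_1x3(red >= threshold)           # binarize + denoise
    ys, xs = np.nonzero(mask)
    if len(xs) == 0:
        return None
    cy, cx = int(round(ys.mean())), int(round(xs.mean()))  # area centroid
    # Zero everything outside a win x win window around the centroid.
    masked = np.zeros_like(red, dtype=float)
    y0, x0 = max(0, cy - win // 2), max(0, cx - win // 2)
    masked[y0:cy + win // 2, x0:cx + win // 2] = red[y0:cy + win // 2, x0:cx + win // 2]
    total = masked.sum()
    u = (masked.sum(axis=0) * np.arange(red.shape[1])).sum() / total
    v = (masked.sum(axis=1) * np.arange(red.shape[0])).sum() / total
    return u, v
```

A bright square painted into a dark test image is recovered at its true centre, which is the property the marking accuracy depends on.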
Table 1
Description of the parameters used in Fig. 11.

Parameter              Details
(xl, yl, θl)           The coordinates and attitude angle of the MR in ΣL
(xl,r, θl,r)           The reference x-coordinate and attitude angle of the MR in ΣL
(ex, eθ)               The deviations between the reference and measured values
(ẋl,d, ẏl,d, θ̇l,d)     The direction values of the translational and rotational speeds of the MR in ΣL
(ω1, ω2, ω3)           The direction values of the rotational speeds of the wheels
(ẋl, ẏl, θ̇l)           The translational and rotational speeds of the MR in ΣL
s                      Differential operator

Table 2
Values of some parameters used in Fig. 11.

Parameter                      Value
xl,r                           0
θl,r                           0
yl,d                           0 or 50ᵃ
Kp,x, Kp,θ, KI,x, KI,θ         20, 0.244, 4, 0

ᵃ The values of 0 and 50 are used for traveling along the circular and straight paths, respectively.
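Tables 1 and 2 describe a PI loop (gains Kp,x = 20, KI,x = 4, Kp,θ = 0.244, KI,θ = 0) that converts the tracking deviations ex and eθ into commanded speeds. Since Fig. 11 itself is not reproduced here, the following is a hedged reconstruction of one control cycle of such a loop, not the authors' block diagram:

```python
class PI:
    """Discrete PI controller: u = Kp*e + Ki * integral(e) dt."""
    def __init__(self, kp, ki):
        self.kp, self.ki, self.acc = kp, ki, 0.0

    def step(self, error, dt):
        self.acc += error * dt            # accumulate the integral term
        return self.kp * error + self.ki * self.acc

# Gains taken from Table 2: (Kp,x, KI,x) and (Kp,theta, KI,theta).
ctrl_x = PI(20.0, 4.0)
ctrl_th = PI(0.244, 0.0)

# One control cycle (about 0.17 s in the proposed system):
# errors between reference and measured values -> speed commands.
vx_cmd = ctrl_x.step(0.5 - 0.48, dt=0.17)   # e_x = x_l,r - x_l
om_cmd = ctrl_th.step(0.0 - 0.01, dt=0.17)  # e_theta = theta_l,r - theta_l
```

The resulting (ẋl,d, θ̇l,d) would then feed Eq. (1) to obtain the three wheel-speed commands.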
Fig. 13. Tracking performance evaluated during traveling along the circular path: xl [mm] and θl [deg] versus time [s] for line laser speeds of 0.3, 0.6, and 0.9 deg/s.
6. Discussion
Fig. 15. Tracking performance evaluated during traveling along the straight line path: a. relationship between xl and yl (σ = 2.0 mm); b. relationship between θl and yl (σ = 0.5 deg).
Fig. 17. The performance characteristics of the line laser tracking process: a. frequency distribution of xl (average 0.9 mm, σ = 26.8 mm); b. frequency distribution of θl (average 0.03 deg, σ = 1.2 deg).

Table 3
Results of the marking accuracy measurements conducted during laboratory testing.

Parameter                            Δx      Δy      √(Δx² + Δy²)
Average deviation (mm)               0.7     −0.6    1.5
Standard deviation (mm)              1.1     0.9     0.9
Maximum deviation (mm)               3.9     −2.6    4.2
Fraction of deviations < 3 mm (%)    96      100     94

Although it is difficult to achieve the speed of the construction workers, our goal is to be able to mark 100 points per hour. In the case of dry partition walls, the marking pattern is more complicated than that used for free access floors. Therefore, we expect that marking such walls with the automated robot system would be even more effective.

6.3. Comparison to the results of previous studies

The accuracy, the time required to mark a single position, and other parameters of the proposed system are compared with those of the previously reported marking systems [5,7–11] in Table 5. These systems can be roughly divided into two types: the first type includes the systems using only inner sensors (such as the laser range finder), and the other type utilizes external measuring instruments (such as 3D ones). The system described in Ref. [5] belongs to the first category. Although it moves and draws extremely fast, its accuracy is more than 10 times worse than those of the other systems because its localization procedure mainly utilizes inner sensors, which are not suitable for marking at construction sites. In the second category, only the system developed in this study uses a reflectorless 3D measuring instrument; although this instrument cannot track and collimate a prism target, it is less expensive than an instrument that can. In the system reported in Ref. [11], an operator has to move the robot to the marking points using a remote controller. The time required for marking a single position reported in Refs. [7–9] is 2 min (excluding the time of moving from the former position to the next one). Meanwhile, the system developed in this work is fully automated and can mark a single position in 98 s including the time for moving to the next position, while the size and weight of the proposed system are smaller than those of the systems described previously. Thus, except for the accuracy, the developed robot is superior to the other reported systems.

Fig. 18. Frequency distributions of Δx and Δy obtained during laboratory testing.

Fig. 19. Experimental setup used during testing at the construction site.

7. Conclusions and future research

From the results obtained in this work, the following conclusions can be drawn.

1. Automated marking of the 6 m × 6.5 m area of the free access floor at a construction site was successfully performed.
2. The average deviation between the marks drawn by the MR and the designated positions was 2.3 mm, and 77% of all deviations were below 3 mm.
3. The average marking time was 98 s, which was shorter than that of the previously reported system using a 3D measuring instrument.

Table 4
Results of the marking accuracy measurements conducted at the construction site.

Parameter                            Δx      Δy      √(Δx² + Δy²)
Average deviation (mm)               1.7     0       2.3
Standard deviation (mm)              1.9     1.4     1.6
Maximum deviation (mm)               9.5     4.5     10.2
Fraction of deviations < 3 mm (%)    87      97      77

However, there are two major limitations in this study that should be addressed in future research. When the proposed marking system was tested outside the 6 m × 6.5 m area, the MR guidance procedure failed due to sunlight interference (see Section 6.1). At the same time, the laser spot of the measuring laser could still be detected. These results suggest that the MR guidance method using the line laser is significantly affected by sunlight, which represents a major problem for automated marking during the daytime. To solve this problem, a new MR guidance method without a line laser must be developed.

When the MR of the proposed system was occluded by an obstacle (such as a column or a wall), its position could not be determined by the 3D-MI unit. Thus, the developed system may be effectively used only in non-occluded areas, which is also a limitation of the systems described in previous works [7–11]. A possible solution to this problem is the simultaneous operation of multiple systems. Because the proposed robotic marking system contains an inexpensive reflectorless 3D measuring instrument, its operation cost may be significantly lower than those of the other systems.

Fig. 20. Frequency distributions of Δx and Δy obtained during the test at the construction site (Δxave = 1.7 mm, Δyave = 0 mm).

The experiment conducted at the construction site revealed the limitations of the developed system.
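The summary statistics reported in Tables 3 and 4 (average, standard deviation, maximum deviation, and fraction below 3 mm) follow directly from the per-point deviations. A helper like the one below reproduces them from raw marked-versus-designated offsets; the input data here are illustrative, not the paper's measurements.

```python
import math

def accuracy_stats(deviations_mm, tol=3.0):
    """Summarise a list of signed deviations the way Tables 3 and 4 do:
    (average, population standard deviation, worst deviation, % within tol)."""
    n = len(deviations_mm)
    avg = sum(deviations_mm) / n
    std = math.sqrt(sum((d - avg) ** 2 for d in deviations_mm) / n)
    worst = max(deviations_mm, key=abs)
    frac = 100.0 * sum(abs(d) < tol for d in deviations_mm) / n
    return avg, std, worst, frac
```

Running it once per axis (Δx, Δy) and once on the radial deviations √(Δx² + Δy²) yields the three table columns.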
Table 5
Performance parameters of the marking robotic systems.

Parameter    Proposed system    System reported in [5]    System reported in [7–9]    System reported in [11]

ᵃ The distance between the adjacent marking positions is 50 cm.
ᵇ Excluding the movement to the marking position.
ᶜ 3D measuring instrument that cannot collimate and track a prism.
ᵈ 3D measuring instrument that can collimate and track a prism.
Currently, we are developing a new robotic system to solve the first limitation. We want to use the robot guidance technology developed in this study not only for the marking work but also for installing building materials in the future.

References

[1] S. Sakamoto, H. Kishimoto, K. Tanaka, Y. Maeda, 3D measuring and marking system for building equipment: developing and evaluating prototypes, Proceedings of the 26th International Symposium on Automation and Robotics in Construction, Austin, USA, 2009, pp. 131–138, https://doi.org/10.22260/ISARC2009/0001.
[2] S. Sakamoto, N. Kano, T. Igarashi, H. Kishimoto, H. Fujii, Y. Oosawa, K. Minami, K. Ishida, Laser marking system based on 3D CAD model, Proceedings of the 28th International Symposium on Automation and Robotics in Construction, Seoul, Korea, 2011, pp. 64–69, https://doi.org/10.22260/ISARC2011/0009.
[3] S. Sakamoto, N. Kano, T. Igarashi, H. Tomita, Laser positioning system using RFID-tags, Proceedings of the 29th International Symposium on Automation and Robotics in Construction, Eindhoven, Netherlands, 2012, https://doi.org/10.22260/ISARC2012/0049.
[4] K. Tanaka, M. Kajitani, C. Kanamori, H. Itoh, Y. Abe, Y. Tanaka, Development of marking robot working at building sites, Proceedings of the 12th International Symposium on Automation and Robotics in Construction, Warsaw, Poland, 1995, pp. 235–242, https://doi.org/10.22260/ISARC1995/0030.
[5] P. Jensfelt, G. Gullstrand, E. Förell, A mobile robot system for automatic floor marking, J. Field Rob. 23 (6–7) (2006) 441–459, https://doi.org/10.1002/rob.20125.
[6] Z.Z. Abidin, S.B.A. Hamid, A.A.A. Aziz, A.A. Malek, Development of a vision system for a floor marking mobile robot, Proceedings of the 5th International Conference on Computer Graphics, Imaging and Visualization, Penang, Malaysia, 2008, pp. 88–92, https://doi.org/10.1109/CGIV.2008.69.
[7] F. Inoue, S. Doi, E. Omoto, Development of high accuracy position making system applying mark robot in construction site, Society of Instrument and Control Engineers Annual Conference 2011, Tokyo, Japan, 2011, pp. 2413–2414. INSPEC Accession Number: 12354330. [Online]. Available: https://ieeexplore.ieee.org/document/6060381, accessed 28 May 2019.
[8] F. Inoue, E. Omoto, Development of high accuracy position marking system in construction site applying automated mark robot, Society of Instrument and Control Engineers Annual Conference 2012, Akita, Japan, 2012, pp. 819–823. INSPEC Accession Number: 13056258. [Online]. Available: https://ieeexplore.ieee.org/document/6318554, accessed 28 May 2019.
[9] E. Ohmoto, F. Inoue, S. Doi, Marking system applying automated robot for construction site (in Japanese), Reports of the Technical Research Institute Obayashi-Gumi Ltd., No. 76, 2012, pp. 1–7. [Online]. Available: http://warp.da.ndl.go.jp/info:ndljp/pid/7860137/www.obayashi.co.jp/technology/shoho/076/2012_076_33.pdf, accessed 4 April 2019.
[10] DPR Construction, Laybot Shows Promise in Speed and Accuracy, 2013. [Online]. Available: https://www.dpr.com/media/news/2013, accessed 22 February 2018.
[11] T. Kitahara, K. Satou, J. Onodera, Marking robot in cooperation with three-dimensional measuring instruments, Proceedings of the 35th International Symposium on Automation and Robotics in Construction, Berlin, Germany, 2018, pp. 292–299, https://doi.org/10.22260/ISARC2018/0042.
[12] M. Lindholm, S. Pieska, L. Koskela, 3-D measurements in construction robotization, Proceedings of the 7th International Symposium on Automation and Robotics in Construction, Bristol, United Kingdom, 1990, pp. 126–133, https://doi.org/10.22260/ISARC1990/0017.
[13] Y.J. Beliveau, J.E. Fithian, M.P. Deisenroth, Autonomous vehicle navigation with real-time 3D laser based positioning for construction, Autom. Constr. 5 (1996) 261–272, https://doi.org/10.1016/S0926-5805(96)00140-9.
[14] Y. Fukase, H. Kanamori, S. Kimura, Self-localization system for robots using random dot floor patterns, Proceedings of the 30th International Symposium on Automation and Robotics in Construction, Montréal, Canada, 2013, pp. 304–312, https://doi.org/10.22260/ISARC2013/0033.
[15] Y. Fukase, H. Kanamori, H. Taga, Application of self-location system using a floor of random dot pattern to an automatic guided vehicle, Proceedings of the 33rd International Symposium on Automation and Robotics in Construction, Auburn, USA, 2016, pp. 42–49, https://doi.org/10.22260/ISARC2016/0006.
[16] T. Bailey, H. Durrant-Whyte, Simultaneous localization and mapping (SLAM): part II, IEEE Robot. Autom. Mag. 13 (3) (2006) 108–117, https://doi.org/10.1109/MRA.2006.1678144.
[17] M.G. Dissanayake, P. Newman, S. Clark, H.F. Durrant-Whyte, M. Csorba, A solution to the simultaneous localization and map building (SLAM) problem, IEEE Trans. Robot. Autom. 17 (3) (2001) 229–241, https://doi.org/10.1109/70.938381.
[18] S. Thrun, Simultaneous localization and mapping, Robotics and Cognitive Approaches to Spatial Mapping, Springer, ISBN 978-3-540-75386-5, 2007, pp. 13–41, https://doi.org/10.1007/978-3-540-75388-9_3.
[19] D. Fox, S. Thrun, W. Burgard, F. Dellaert, Particle filters for mobile robot localization, Sequential Monte Carlo Methods in Practice, Springer, 2001, pp. 401–428, https://doi.org/10.1007/978-1-4757-3437-9_19.
[20] H. Peel, S. Luo, A.G. Cohn, R. Fuentes, Localisation of a mobile robot for bridge bearing inspection, Autom. Constr. 94 (2018) 244–256, https://doi.org/10.1016/j.autcon.2018.07.003.
[21] X. Zhang, M. Li, J.H. Lim, Y. Weng, Y.W.D. Tay, H. Pham, Q.-C. Pham, Large-scale 3D printing by a team of mobile robots, Autom. Constr. 95 (2018) 98–106, https://doi.org/10.1016/j.autcon.2018.08.004.
[22] G. Campion, G. Bastin, B. D'Andrea-Novel, Structural properties and classification of kinematic and dynamic models of wheeled mobile robots, IEEE Trans. Robot. Autom. 12 (1) (1996) 47–62, https://doi.org/10.1109/70.481750.
[23] H.-C. Huang, T.-F. Wu, C.-H. Yu, H.-S. Hsu, Intelligent fuzzy motion control of three-wheeled omnidirectional mobile robots for trajectory tracking and stabilization, Proceedings of the 2012 International Conference on Fuzzy Theory and Its Applications, Taichung, Taiwan, 2012, pp. 107–112, https://doi.org/10.1109/iFUZZY.2012.6409684.
[24] ROS.org Homepage. [Online]. Available: http://wiki.ros.org/ja, accessed 1 April 2019.
[25] Z. Zhang, A flexible new technique for camera calibration, IEEE Trans. Pattern Anal. Mach. Intell. 22 (11) (2000) 1330–1334, https://doi.org/10.1109/34.888718.
[26] Scicos Homepage. [Online]. Available: http://www.scicos.org/, accessed 1 April 2019.