
Ocean Engineering 207 (2020) 107380

Contents lists available at ScienceDirect

Ocean Engineering
journal homepage: www.elsevier.com/locate/oceaneng

Forecasting a water-surface wave train with artificial intelligence - A case study
Hiroshi Kagemoto
Nagasaki Institute of Applied Science, Japan
E-mail address: kagemoto_hiroshi@nias.ac.jp
https://doi.org/10.1016/j.oceaneng.2020.107380
Received 30 December 2019; Received in revised form 9 April 2020; Accepted 12 April 2020; Available online 1 May 2020
0029-8018/© 2020 Elsevier Ltd. All rights reserved.

A R T I C L E  I N F O

Keywords: Artificial intelligence, Neural networks, LSTM, Forecasting time series, Forecasting ocean waves, Forecasting motions

A B S T R A C T

With the advent of 'Deep Learning', Artificial Intelligence (AI) has attracted the attention of researchers in various fields, including ocean engineering. This paper applies AI to forecasting a water-surface wave train. Recurrent neural networks (RNN) are used to forecast both actual wave trains and numerically reproduced irregular wave trains. The specific type of network used here is the Long Short-Term Memory (LSTM) model, which is known to have good properties for time series prediction. The methodology is extended to forecasting the motion responses of a floating body in an irregular wave train. The LSTM is found to generate reasonably accurate forecasts, despite the nonlinearity of the data.

1. Introduction

With the advent of 'Deep Learning', Artificial Intelligence (AI) has attracted renewed interest from researchers in various fields. There should be a vast potential in the application of AI to ocean engineering. A recent example is the use of AI in autonomous ship maneuvering (e.g. gCaptain, 2018; Rolls Royce, 2018). The present work investigates the use of AI to forecast a water-surface wave train. Specifically, artificial neural networks (ANN) are used to predict both actual irregular wave trains measured in the ocean and numerically reproduced irregular wave trains.

It is understood that while wave trains exhibit nonlinear variability, they are not random, and can be predicted to within a given threshold of accuracy. In this respect, it is common practice to model ocean waves using a linear superposition of sinusoidal waves of different amplitudes. In principle, if ocean waves could be represented by a linear superposition of sinusoidal waves, it should be possible to forecast the next-coming waves by decomposing the preceding wave train into component waves. However, this may take some time and may not predict rapidly enough for practical purposes. Moreover, highly nonlinear large waves such as rogue waves (e.g., Dean, 1990; Clauss and Marco, 2009) cannot be predicted by linear superposition of component waves.

In applying artificial neural networks (ANN) to the prediction of time series, it is known that simple feedforward neural networks do not work well because they cannot take account of long-term dependence. In order to overcome this drawback, recurrent neural networks (RNN) have been developed. These can store the information from past values and use it in predicting the future (Williams and Zipser, 1989). Connor et al. (1994), using synthetic and real data, have shown the superiority of RNN over feedforward neural networks in predicting time series. Among a variety of RNN approaches, the present study uses Long Short-Term Memory (LSTM), which can store information from the remote past as well as the recent past (Hochreiter and Schmidhuber, 1997).

Application of deep learning methods to time series forecasting has been seen in fields ranging from energy consumption (e.g., Ruiz et al., 2018), traffic flow (e.g., Ma et al., 2015; Zhao et al., 2017), and rainfall (Mishra et al., 2018) to stock prices (e.g., Nelson et al., 2017). The literature on forecasting ocean waves consists primarily of two categories. One is large-scale physics-based models, which simulate ocean waves by solving physics-based equations (e.g., Molteni et al., 1996; Rogers et al., 2007; Tolman, 2009; Reikard et al., 2011). The other is time series and statistical models (Abraham and Ledolter, 2009). Physics-based models are known to be effective for forecasting over longer time horizons, while statistical models work well for short-term prediction (Reikard and Rogers, 2011). AI and machine learning methods are closer to the time series approach. In this respect, several prior works have used neural networks (Deo and Naidu, 1998; Jain and Deo, 2008; Günaydin, 2008; Gopinath and Dwarakish, 2015; James et al., 2018). Most of these works focused on forecasting wave parameters (significant wave heights, average periods, etc.) over horizons of several hours to several days.

As for the prediction of individual wave heights, Mase et al. (2011) investigated real-time prediction of each water-surface level in a
tsunami train surging into Osaka Bay in Japan using a neural net. In the
present work, the goal is to predict the next wave height in a
wind-driven fully developed wave train. If the next wave height can be
predicted, this should have many practical applications. For example, it
would be useful in securing the safety of ships and offshore structures
such as oil drilling platforms. It is clearly relevant to wave farms.

2. Method of analyses

2.1. The wave trains subjected to the analyses

As the wave trains subjected to the present analyses, the following two kinds of irregular wave trains were provided:

1) Numerically produced irregular wave trains
2) Irregular wave trains measured in a real sea

2.1.1. Numerically produced irregular wave trains

In numerically producing irregular wave trains in the present study, it is assumed that the waves have the following power spectrum S(ω), known as the Modified Pierson-Moskowitz Spectrum proposed by the ISSC (International Ship and Offshore Structures Congress):

S(ω) = 0.11 H_{1/3}^2 ω_0^{-1} (ω/ω_0)^{-5} exp[-0.44 (ω/ω_0)^{-4}]     (1)

where H_{1/3} and T_0 represent respectively the significant wave height and the average period of the corresponding irregular wave train, and ω_0 = 2π/T_0. The values of H_{1/3} and T_0 assumed in the present work are 2.6 m and 6.0 s respectively. (These specific values of H_{1/3}, T_0 were selected so that they were about the same as those of the waves measured in a real sea, which appear later in this paper.) The power spectrum S(ω) given by Eq. (1) with these values of H_{1/3} and T_0 is shown in Fig. 1.

Fig. 1. Power spectrum of the irregular wave train subjected to the present analyses (Modified Pierson-Moskowitz Spectrum with H_{1/3} = 2.6 m, T_0 = 6.0 s).

Based on the power spectrum given by Eq. (1), an irregular wave train that possesses this spectrum is produced numerically in a computer by the following process.

(1) The power spectrum is approximated as the summation of N slender vertical strips, as shown in Fig. 2 (N = 60 was used for the present analyses). The widths of the strips are not set to equal values but to unequal values using random numbers, so that the same pattern of an irregular wave train does not appear periodically in the produced wave train.

Fig. 2. Approximation of a wave power spectrum by the summation of N slender vertical strips.

(2) Using the unequally divided ω_i (i = 1, 2, ..., N), where N represents the number of segments of the corresponding power spectrum (see Fig. 2), the irregular wave train ζ(t, x) progressing in the positive x direction is produced numerically by the following equation:

ζ(t, x) = Σ_{i=1}^{N} ζ_i cos(k_i x - ω_i t - ε_i)     (2)

where ζ(t, x) stands for the instantaneous free-surface displacement at time t and at point x. ζ_i represents the amplitude of the i-th component wave, which is related to the segmented power spectrum as follows:

ζ_i = √(2 S(ω_i) Δω_i)     (3)

Here Δω_i is the width of the i-th slender vertical strip, and k_i in Eq. (2) represents the wave number of the i-th component wave, which is related to ω_i in the case of large water depth as follows:

k_i = ω_i^2 / g     (4)

where g is the gravitational acceleration. (Assuming the water depth is large, the relationship given by Eq. (4) was used for the present analyses.) The phase ε_i in Eq. (2) is chosen randomly in the range 0~2π.

2.1.2. Irregular wave trains measured in a real sea

The irregular wave trains measured in a real sea and subjected to the present analyses were provided by courtesy of Professor Takuji Waseda of the University of Tokyo. The wave data were measured off Kozu Island, one of the Izu Seven Islands in the Pacific Ocean south of Tokyo, Japan, at 34°15′46″N, 139°07′57.1″E. For the measurement, a floating buoy equipped with GPS (Global Positioning System) and moored by catenary mooring lines was used. Since the diameter of the buoy was much smaller than the wavelengths of the relevant ambient waves, the vertical displacement of the buoy itself was used in the present study as the free-surface displacement at the corresponding point.
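For readers who wish to reproduce the wave trains of Section 2.1.1, the following is a minimal Python sketch of the synthesis defined by Eqs. (1)-(4). It is not the author's original code; the frequency band, the sampling rate, the random-number generator and the evaluation at a single point x are assumptions made here for illustration.

import numpy as np

def issc_spectrum(omega, h13=2.6, t0=6.0):
    # Modified Pierson-Moskowitz (ISSC) spectrum of Eq. (1)
    w0 = 2.0 * np.pi / t0
    return 0.11 * h13**2 / w0 * (omega / w0)**(-5) * np.exp(-0.44 * (omega / w0)**(-4))

def synthesize_wave_train(t, x=0.0, n_comp=60, w_min=0.3, w_max=3.0, seed=0):
    rng = np.random.default_rng(seed)
    # step (1): split the frequency band into N strips of random (unequal) width
    edges = np.concatenate(([w_min], np.sort(rng.uniform(w_min, w_max, n_comp - 1)), [w_max]))
    w_i = 0.5 * (edges[:-1] + edges[1:])                 # strip-centre frequencies
    dw_i = np.diff(edges)                                # strip widths Δω_i
    zeta_i = np.sqrt(2.0 * issc_spectrum(w_i) * dw_i)    # component amplitudes, Eq. (3)
    k_i = w_i**2 / 9.81                                  # deep-water wave numbers, Eq. (4)
    eps_i = rng.uniform(0.0, 2.0 * np.pi, n_comp)        # random phases
    # step (2): superpose the component waves, Eq. (2)
    return np.sum(zeta_i[:, None] * np.cos(k_i[:, None] * x - np.outer(w_i, t) - eps_i[:, None]), axis=0)

t = np.arange(0.0, 1500.0, 0.5)    # a 1500 s record sampled at 2 Hz (assumed)
zeta = synthesize_wave_train(t)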

2.2. Mechanical learning and forecasting by RNN

Among machine learning methods, Long Short-Term Memory (LSTM) networks, which are known to be effective for time-series analyses, were adopted, using the open-source TensorFlow software provided by Google. The algorithms are available through the Keras library.

2.2.1. Mechanical learning by RNN

Fig. 3. Division of a time series of waves into training data and test data.

Mechanical learning by RNN was carried out in the following way. The time series of waves is divided into two parts, as shown in Fig. 3. The initial section is used as training data, while the remainder of the data set is used as test data. The data consist of the peak values (maximal values and minimal values). When the LSTM is trained, it
analyzes a sequence of n peak heights and predicts the next peak, as shown in Fig. 4(a). This process is repeated by shifting the sequence of n peaks by one peak, as shown in Fig. 4(b). This iterative training process was repeated up to the end of the training data.

Fig. 4. Mechanical learning by RNN.

2.2.2. Forecasting by RNN

The same process as mentioned in 2.2.1 is used in forecasting. The LSTM forecasts the height of the next peak from the previous n peaks, as shown in Fig. 5(a) and Fig. 5(b).

To be more specific, 40 peaks (the 501st ~ 540th) were used as test data. The forecasting process was conducted as follows:

First, the LSTM forecasts the 521st peak height from the 501st~520th peak heights.
Second, the LSTM forecasts the 522nd peak height from the 502nd~521st peak heights.
Third, the LSTM forecasts the 523rd peak height from the 503rd~522nd peak heights.
⋮
Finally, the LSTM forecasts the 540th peak height from the 520th~539th peak heights.
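To make the training and forecasting procedure of Sections 2.2.1-2.2.2 concrete, the sketch below shows one possible implementation (an illustration, not the code used in the paper; the peak-extraction rule and the helper names are assumptions). It extracts the peak values from a wave record, builds the "n peaks in, next peak out" training pairs of Fig. 4, and runs the rolling one-peak-ahead forecast of Fig. 5 over the test peaks.

import numpy as np

def extract_peaks(zeta):
    # peak values = local maxima and minima of the record (sign change of the slope)
    d = np.diff(zeta)
    idx = np.where(np.sign(d[:-1]) != np.sign(d[1:]))[0] + 1
    return zeta[idx]

def make_windows(peaks, n=20):
    # training pairs of Fig. 4: n consecutive peak heights -> the following peak
    # (peaks is a numpy array, assumed already normalized to 0-1 as in the paper)
    x = np.array([peaks[i:i + n] for i in range(len(peaks) - n)])
    y = peaks[n:]
    return x[..., None], y            # the LSTM expects input of shape (samples, n, 1)

def rolling_forecast(model, peaks, first=500, last=540, n=20):
    # Fig. 5: forecast each test peak from the n peaks immediately preceding it;
    # with first=500 and n=20 this forecasts the 521st-540th peaks, as in Section 2.2.2
    preds = []
    for k in range(first + n, last):
        window = peaks[k - n:k].reshape(1, n, 1)
        preds.append(model.predict(window)[0, 0])
    return np.array(preds)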


Fig. 5. Forecasting of the (n+1)th peak height by RNN.

In effect, an adaptive 'rolling window forecasting' methodology is used. The literature on rolling window forecasting provides rigorous criteria as to the number of prior data points that should be used in each window (Inoue et al., 2017). The number of data points used here, n = 20, worked well empirically. Although the time duration in which 20 peaks appear varies depending on the location, in the data sets used here it was about 50 s on average, which, in turn, was about 2 times the relevant longest wave period. Some results of parametric studies conducted to examine the appropriate window size, obtained while varying the window size n, are shown later in Section 3.1.

A part of the code used in the analyses, which provides a better sense of the architecture, is shown in the Appendix.

3. Results and discussion

3.1. Numerically produced wave train

First, the numerically produced wave trains shown in Fig. 6(a) were analyzed. In the analyses, 500 peaks were used for training (hereafter this is referred to as Case-1).

Table 1 shows the correlation coefficients and root mean squared errors between the peak heights predicted by the LSTM and the real peak heights. In Table 1, four different results based on four runs of the model are reported; 3 out of 4 correlation coefficients r are 0.65-0.80. Considering that, roughly speaking, a correlation coefficient higher than 0.7 indicates a strong correlation between the corresponding two values, it is rather impressive that the peak heights can actually be predicted fairly well by this method.

Table 1
Correlation coefficients (upper row) and root mean squared errors (lower row) between peak heights predicted by RNN and real peak heights (Case-1).

            1       2       3       4       average
Case-1      0.811   0.648   0.658   0.515   0.658
            0.086   0.120   0.127   0.145   0.119

Fig. 6. Time series of numerically produced wave trains subjected to the analyses. (Time series of 1500 s duration are shown.)
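For reference, the two figures of merit reported in Table 1 and in the later tables can be computed from the normalized predicted and real peak heights as in the short sketch below (an assumed implementation; the paper does not spell out the exact formulas it used).

import numpy as np

def correlation_and_rmse(predicted, real):
    # correlation coefficient r between predicted and real (normalized) peak heights
    r = np.corrcoef(predicted, real)[0, 1]
    # root mean squared error of the normalized peak heights
    rmse = np.sqrt(np.mean((np.asarray(predicted) - np.asarray(real)) ** 2))
    return r, rmse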


Among the four results shown in Table 1, picking up the one that achieved the highest correlation coefficient, r = 0.811, the peak heights predicted by RNN and the real peak heights are compared and correlated in Fig. 7(a-1) and Fig. 7(a-2). In the figure, as is common practice in applying LSTM, both the predicted values and the real values of the peak heights are normalized so that they fall in the range 0-1. In Fig. 7(a-1), the time sequences of the real values and the predicted values are compared, while in Fig. 7(a-2), correlations between the predicted values and the real values are shown. In Fig. 7(a-1), triangle marks indicate the real peak heights of the 521st ~ 540th peaks in the time series of the irregular wave train shown in Fig. 6(a), while circle marks indicate the corresponding peak heights predicted by the RNN. As can be observed in the figure, the circle and triangle marks are respectively joined by smooth lines to give visual images of the predicted wave train and of the real one. (The spatial distance between adjacent peaks is not accounted for in the figure but is set to be equal.) Again in Fig. 7, we can reconfirm that the peak heights can be predicted fairly well by RNN.

Fig. 7. Comparisons between peak heights predicted by RNN and real peak heights (Case-1) (r: correlation coefficient).

Another possibility is to sort the wave heights into discrete classes, as shown in Table 2. Instead of letting the RNN learn and predict the exact values of the peak heights themselves, we let the RNN learn and predict which class the corresponding peak belongs to. Predicting actual peak values may be more difficult for the neural net than predicting ranges or classes. For instance, if instead of 500 different peak heights the RNN is required to learn and predict just dozens of classes, the author speculated that it might be easier for the RNN both to learn and to predict. (From a practical viewpoint also, in many cases forecasting ballpark values instead of exact values may be enough.)

Table 2
Sorting peak heights into classes.

peak height H (m)       class
⋮                       ⋮
0.5 ≤ H < 0.6           5
0.4 ≤ H < 0.5           4
0.3 ≤ H < 0.4           3
0.2 ≤ H < 0.3           2
0.1 ≤ H < 0.2           1
0.0 ≤ H < 0.1           0
-0.1 ≤ H < 0.0          -1
-0.2 ≤ H < -0.1         -2
-0.3 ≤ H < -0.2         -3
-0.4 ≤ H < -0.3         -4
-0.5 ≤ H < -0.4         -5
⋮                       ⋮

Subjecting the two time series of wave trains shown in Fig. 6(a) and (b) to the analyses, the above-mentioned attempt was conducted. The two different time series shown in Fig. 6(a) and (b) have the same power spectrum but different phases ε_i (see Eq. (2)). (Hereafter, the analyses conducted on the wave trains shown in Fig. 6(a) and in Fig. 6(b) are called Case-2 and Case-3 respectively.) In Table 3, correlation coefficients and root mean squared errors between the peak heights predicted by RNN and the real peak heights are shown. In the two cases, the averaged correlation coefficients are 0.698 and 0.746 respectively, which suggests that peak heights can be predicted by RNN fairly well, although, contrary to the expectation of the author, no distinct improvement in prediction accuracy is attained compared to Case-1. (Hereafter, as in Case-2 and Case-3, results obtained while classifying the peak heights into classes are shown.)

Table 3
Correlation coefficients (upper row) and root mean squared errors (lower row) between peak heights predicted by RNN and real peak heights (Case-2, Case-3).

            1       2       3       4       average
Case-2      0.689   0.782   0.664   0.658   0.698
            0.134   0.094   0.113   0.139   0.120
Case-3      0.803   0.585   0.830   0.769   0.746
            0.103   0.145   0.106   0.115   0.117

As for the comparison of the prediction accuracies in Case-2 and in Case-3, no distinct differences are observed, which suggests that, regardless of the differences between the time series of wave trains subjected to the analyses, about the same prediction accuracies can be attained.

Picking up the attempts in Table 3 that achieved the highest accuracies in Case-2 (r = 0.782) and in Case-3 (r = 0.830), the peak heights predicted by RNN and the real peak heights are compared and correlated in Fig. 8(a) and Fig. 8(b).

In order to examine the appropriate window size, that is, the number n, learning and prediction by RNN were conducted while varying the window size systematically as n = 5, 10, 20, 40. The results are summarized in Table 4. As in Table 3, four attempts were carried out for each window size n. In the table, the correlation coefficient and the root mean squared error between peak heights predicted by RNN and real peak heights obtained in each attempt, and the averages over the four attempts, are shown. In terms of both the correlation coefficients and the root mean squared errors, no distinct differences among the results obtained with the various n are observed; even with a window size as small as n = 5, fairly good predictions could already be attained. With the window size n = 20, the correlation coefficient is actually the lowest, with the average value r = 0.698, although it still suggests quite high correlation; on the other hand, the root mean squared error is the lowest. Overall, no distinct conclusion can be drawn as to which window size should be used, but the window size n = 20 was used throughout the calculations conducted in the present study.
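The class sorting of Table 2 amounts to quantizing each peak height into a bin of width 0.1 m and letting the network work with the bin index. A minimal sketch consistent with Table 2 (the bin width and the decoding to bin centres are the only assumptions) is:

import numpy as np

def peaks_to_classes(peaks, bin_width=0.1):
    # Table 2: class k contains heights with k*0.1 m <= H < (k+1)*0.1 m (k negative for troughs)
    return np.floor(np.asarray(peaks) / bin_width).astype(int)

def classes_to_heights(classes, bin_width=0.1):
    # decode a predicted class back to a representative height (the bin centre)
    return (np.asarray(classes) + 0.5) * bin_width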


Fig. 8. Comparisons between peak heights predicted by RNN and real peak heights (Case-2, Case-3) (r: correlation coefficient).

In order to examine the effect of the number of training data, two cases (Case-4, Case-5) were compared. Fig. 9 shows the time series of wave trains subjected to these analyses. As specified in the caption of Fig. 9, a time series of 3000 s duration, which is twice the duration of the time series shown in Fig. 6, was subjected to the analyses. In Case-4, 1000 peak heights were used for the mechanical learning of the RNN, while, in Case-5, 500 out of the 1000 peak heights were used for the mechanical learning. In both cases, the same test data were subjected to the prediction of peak heights by the RNN.

Fig. 9. Time series of wave trains subjected to the analyses of Case-4, Case-5. (Time series of 3000 s duration are shown.)

In Table 5, correlation coefficients and root mean squared errors between the peak heights predicted by RNN and the real peak heights obtained in Case-4 and in Case-5 are shown. From the table, slight improvements in both the correlation coefficients and the root mean squared errors seem to be achieved by increasing the number of training data, which is consistent with our intuition, although the improvements are not distinct.

Table 4
Correlation coefficients (upper row) and root mean squared errors (lower row) between peak heights predicted by RNN and real peak heights obtained with various window sizes n.

n           1       2       3       4       average
5           0.711   0.812   0.656   0.641   0.705
            0.143   0.121   0.152   0.160   0.144
10          0.659   0.831   0.760   0.789   0.759
            0.147   0.113   0.137   0.123   0.130
20          0.689   0.782   0.664   0.658   0.698
            0.134   0.094   0.113   0.139   0.120
40          0.593   0.854   0.793   0.675   0.728
            0.180   0.089   0.111   0.132   0.128

Table 5
Correlation coefficients (upper row) and root mean squared errors (lower row) between peak heights predicted by RNN and real peak heights (Case-4, Case-5).

            1       2       3       4       average
Case-4      0.684   0.674   0.661   0.498   0.629
            0.128   0.125   0.125   0.149   0.131
Case-5      0.588   0.511   0.593   0.582   0.568
            0.144   0.143   0.147   0.149   0.145

Since, from a practical viewpoint, even if one can forecast that a large wave is coming next, there may be no time or no way to prepare for the incoming large wave, an attempt was made in which the height of the (n+5)th peak, instead of the height of the (n+1)th peak, was forecast by RNN from the peak heights of the preceding n peaks. (This attempt was carried out using the numerically produced wave train shown in Fig. 6(a), and hereafter it is called Case-6.) In Table 6, correlation coefficients and root mean squared errors between the peak heights predicted by RNN and the real peak heights are shown. It is observed that the correlation coefficients in the 4 attempts conducted could still be higher than 0.6. (In one of the four attempts shown in the table, the correlation coefficient is as low as 0.382, which is markedly lower than the other three values. If this value is omitted in calculating the average, the average correlation coefficient is 0.700.) Picking up the attempt that resulted in the highest correlation coefficient, r = 0.786, the peak heights predicted by RNN and the real peak heights are compared and correlated in Fig. 10. Although this is just an example calculation conducted for a particular case, it is interesting that the height of the (n+5)th peak, instead of the height of the (n+1)th peak, could be forecast by RNN fairly well from the peak heights of the preceding n peaks.

Table 6
Correlation coefficients (upper row) and root mean squared errors (lower row) between peak heights predicted by RNN and real peak heights (Case-6).

            1       2       3       4       average
Case-6      0.786   0.695   0.382   0.620   0.620
            0.102   0.123   0.170   0.122   0.129
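In terms of the windowing sketch given after Section 2.2.2, Case-6 changes only the target index: each window of n peaks is paired with the peak five positions after the window instead of the immediately following one. A hedched, assumed helper mirroring make_windows above is:

import numpy as np

def make_windows_ahead(peaks, n=20, lead=5):
    # pair each window of n peak heights with the peak 'lead' positions after it;
    # lead=1 reproduces the (n+1)th-peak case, lead=5 reproduces Case-6
    x = np.array([peaks[i:i + n] for i in range(len(peaks) - n - lead + 1)])
    y = peaks[n + lead - 1:]
    return x[..., None], y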


Fig. 10. Comparisons between peak heights predicted by RNN and real peak heights
(Case-6) (r: correlation coefficient).

3.2. Wave trains measured in a real sea

As mentioned in 3.1, the results of the analyses conducted using the numerically produced wave trains suggest that the immediately next wave peak height can actually be predicted fairly accurately by RNN. However, there exist certain differences between wave trains in a real sea and those numerically produced in a computer in this study. To name a few:

(1) Wave trains in a real sea are not necessarily unidirectional ones that are expressed by Eq. (2); rather, they more or less consist of multidirectional wave trains.
(2) Even if the wave train is unidirectional, it is dubious whether a wave train in a real sea can be expressed as a linear superposition of regular wave trains as in Eq. (2).
(3) Even if the wave train can be expressed as in Eq. (2), there is no guarantee that the sea state expressed by Eq. (2) remains the same while the RNN is mechanically learning.
(4) Waves in a real sea are not necessarily those produced by wind; they can be produced by other causes such as seaquakes, landslides, astronomical tide and so on.

These are just a few possible differences between numerically produced wave trains expressed by Eq. (2) and wave trains in a real sea which the author could come up with. There could be other differences between the two kinds of wave trains. Therefore, although the results obtained in 3.1 using numerically produced wave trains suggest that the immediate future wave peak height can be forecast fairly accurately by RNN, there is no guarantee that this is also the case for wave trains in a real sea.

Under these concerns, the same analyses as those conducted in 3.1 were carried out while subjecting the wave trains measured in a real sea to the analyses.

The measured wave time series subjected to the present analyses are shown in Fig. 11, and the statistical values, that is, the significant wave height H_{1/3} and the average period T_0, of these wave trains are summarized in Table 7. (The specific values of H_{1/3} = 2.6 m, T_0 = 6.0 s assumed in numerically producing wave trains were selected so that they were about the same as those of the waves measured in a real sea shown in Table 7.) The analyses conducted using each of the 3 different wave time series shown in Fig. 11(a), (b), (c) are hereafter called Real-1, Real-2, Real-3 respectively.

Fig. 11. Wave time series off Kozu Island in Japan measured on Oct. 4th, 2010. (Time series of 1200 s duration are shown.)

In Table 8, correlation coefficients and root mean squared errors between the peak heights predicted by RNN and the real peak heights in the three cases (Real-1, Real-2, Real-3) are shown. As can be observed in the table, it turned out that the correlation coefficients could still be fairly high even for wave trains in a real sea. Particularly in the case Real-3, all the correlation coefficients obtained in the four attempts are as high as around 0.8. Picking up the result that showed the highest correlation coefficient in Table 8, that is, r = 0.857 obtained in Real-3, the peak heights predicted by RNN and the real peak heights are compared and correlated in Fig. 12(a-1) and Fig. 12(a-2). Other than the fact that even the wave trains in a real sea could be predicted fairly well by RNN, another interesting fact known from the results shown in Table 8 is that, in terms of correlation coefficients, the wave train measured at 22:00-22:20, that is, Real-3, is apparently more predictable than the other two wave trains measured at 00:00-00:20 and at 12:00-12:20 on the same day, which may suggest that the structure of the wave trains somehow changed distinctively by 22:00. On the other hand, it is also noticeable that, in terms of root mean squared errors, the corresponding values in the case Real-3 are not as good as the correlation coefficients; the root mean squared errors in the case Real-3 are even quite a bit larger than those of the case Real-2, although the correlation coefficients of the case Real-3 are significantly higher than those of the case Real-2.

4. Forecasting motion responses of a floating structure in an irregular wave train

So far, forecasting of an irregular wave train has been dealt with, but here the feasibility of forecasting motion responses in an irregular wave train is examined. For some practical purposes, forecasting motions rather than forecasting waves may be preferable. To name a few, if one wants to control the motions of an ocean-energy harvesting device for effective

extraction of ocean energy, or if one wants to control the relative motions between a maintenance ship and a floating offshore wind-power generator, direct forecasting of the motions themselves rather than forecasting waves may be preferable. Besides, accurate on-board measurement of ambient waves is quite difficult for moving ships or other floating structures, while accurate on-board measurement of the motions of the corresponding ship or floating structure is relatively easy.

Table 7
Statistical values of significant wave heights and average wave periods of the wave trains measured off Kozu Island in Japan.

Time (Oct. 4th, 2010)    H_{1/3} (m)    T_0 (s)
00:00-00:20              2.46           5.78
12:00-12:20              2.79           6.17
22:00-22:20              2.71           6.01

Table 8
Correlation coefficients (upper row) and root mean squared errors (lower row) between peak heights predicted by RNN and real peak heights obtained by using the actual wave trains measured off Kozu Island in Japan (Real-1, Real-2, Real-3).

          Time           1       2       3       4       average
Real-1    00:00-00:20    0.627   0.678   0.575   0.800   0.670
                         0.189   0.177   0.206   0.149   0.180
Real-2    12:00-12:20    0.721   0.547   0.422   0.546   0.559
                         0.107   0.122   0.147   0.133   0.127
Real-3    22:00-22:20    0.847   0.823   0.752   0.857   0.819
                         0.141   0.140   0.185   0.163   0.157

Fig. 12. Comparisons between peak heights predicted by RNN and real peak heights (Real-3) (r: correlation coefficient).

In order to investigate the feasibility of forecasting motion responses of a floating structure in an irregular wave train, the 2-D floating structure of rectangular cross section shown in Fig. 13 was subjected to the present study.

Fig. 13. A 2-D floating body in an irregular wave train. (L = 10.0 m, d = 2.0 m)

4.1. Calculation of motions

4.1.1. Equations of motions

The equations of motions used in the present calculations are as follows:

Σ_{j=1}^{3} [(M_{ℓj} + A_{ℓj}) ẍ_j + N_{ℓj} ẋ_j + k_{ℓj} x_j] = F_{wℓ}   (ℓ = 1~3)     (5)

Here, ℓ = 1, 2 represent the translational motions in the x direction (surge) and the z direction (heave) respectively, and ℓ = 3 represents the rotational motion around the y axis (pitch). (As for the definition of the axes, see Fig. 13.) Furthermore,

x_j: motion displacement in the j-th direction
M_{ℓj}: mass or mass moment of inertia in the ℓ-th motion (M_11 = M_22, M_{ℓj} = 0 if ℓ ≠ j)
A_{ℓj}: added mass or added mass moment of inertia in the ℓ-th motion due to the j-th motion
N_{ℓj}: damping coefficient in the ℓ-th motion due to the j-th motion
k_{ℓj}: restoring force/moment coefficient in the ℓ-th motion due to the j-th motion
F_{wℓ}: wave force/moment in the ℓ-th direction

In principle, the equations of motions that describe the time-domain motions should be expressed with a convolution integral accounting for the memory effect due to the waves induced by the motions (Cummins, 1962). In the present computations, however, for the sake of simplicity, we assume that the equations of motions can be expressed by Eq. (5), in which the hydrodynamic coefficients (A_{ℓj}, N_{ℓj}) at a certain representative motion period are taken. (In the present study, the average period T_0 of the irregular wave train was chosen as the representative motion period.)

As for the irregular wave train used in the present calculations, the same one as that expressed by Eq. (2) was used.

In waves expressed by Eq. (2), the wave forces/moments F_{wℓ}(t) are written as follows:

F_{wℓ}(t) = Σ_{i=1}^{N} H_{wℓ}(ω_i) ζ_i cos(k_i x - ω_i t + ε_{wℓ} + ε_i)     (6)

Here, H_{wℓ}(ω_i) and ε_{wℓ}(ω_i) represent respectively the amplitude and the phase of the ℓ-th mode wave force/moment in regular waves of unit amplitude and frequency ω_i.

4.1.2. Numerical time integration

Time series of the motions x_1, x_2, x_3 are obtained by numerically integrating the set of three simultaneous differential equations (Eq. (5)) in time. Among the various existing numerical time-integration methods, the Newmark β method with β = 1/6 was used in the present study.
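As an illustration of the time stepping of Section 4.1.2, the following sketch integrates Eq. (5) with the Newmark β method (β = 1/6; γ = 1/2 is assumed here, and M, C, K stand for the (M+A), N and k matrices of Eq. (5), with the force history F assumed given). It is a generic implementation, not the author's code.

import numpy as np

def newmark_beta(M, C, K, F, dt, beta=1.0/6.0, gamma=0.5, x0=None, v0=None):
    # integrate M x'' + C x' + K x = F(t) for the 3-DOF system of Eq. (5);
    # F has shape (nsteps, 3), one row of surge/heave/pitch forcing per time step
    nsteps, ndof = F.shape
    x = np.zeros((nsteps, ndof))
    v = np.zeros((nsteps, ndof))
    a = np.zeros((nsteps, ndof))
    if x0 is not None:
        x[0] = x0
    if v0 is not None:
        v[0] = v0
    a[0] = np.linalg.solve(M, F[0] - C @ v[0] - K @ x[0])
    S = M + gamma * dt * C + beta * dt**2 * K      # effective system matrix
    for i in range(nsteps - 1):
        x_pred = x[i] + dt * v[i] + (0.5 - beta) * dt**2 * a[i]
        v_pred = v[i] + (1.0 - gamma) * dt * a[i]
        a[i + 1] = np.linalg.solve(S, F[i + 1] - C @ v_pred - K @ x_pred)
        v[i + 1] = v_pred + gamma * dt * a[i + 1]
        x[i + 1] = x_pred + beta * dt**2 * a[i + 1]
    return x                                       # time histories of surge, heave and pitch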
4.2. Mechanical learning and forecasting by RNN

In Fig. 14(b), the numerically obtained time series of the heaving motions of the 2-D floating body in the irregular wave train depicted in Fig. 14(a) is shown. 500 peaks in the time series of the heave motions were used for the mechanical learning of the RNN, and the immediately following 40 peaks were subjected to the examination of the predictability of heave-motion peaks. The procedures of mechanical learning and prediction by RNN are the same as those described in Section 2.2.

Fig. 14. Time series of waves and heave motions. (Time series of 1000 s duration are shown.)

4.3. Results and discussion

In Table 9, correlation coefficients and root mean squared errors between the peak heights of heave motions predicted by RNN and the real peak heights of heave motions are shown. (Hereafter, this case is called Motion-1.) Apparently, the correlation coefficients shown in the table are higher than those obtained in forecasting waves. 3 out of 4 correlation coefficients shown in the table are higher than 0.9, and even the remaining one is close to 0.9. The average value of the 4 correlation coefficients is as high as 0.899. As for the root mean squared errors, they are consistently as low as 0.07-0.08 in the four attempts as well. Picking up the attempt in Table 9 that achieved the highest correlation coefficient (r = 0.915), the peak heights of heave motions predicted by RNN and the real peak heights are compared and correlated in Fig. 15(a-1) and Fig. 15(a-2).

Table 9
Correlation coefficients (upper row) and root mean squared errors (lower row) between peak heights of heave motions predicted by RNN and real peak heights of heave motions (Motion-1).

            1       2       3       4       average
Motion-1    0.875   0.902   0.904   0.915   0.899
            0.078   0.078   0.071   0.072   0.074

The reason why the RNN could forecast heave motions better than waves may be interpreted as follows. Fig. 16(a) and Fig. 16(b) show the Fourier spectra of the wave train and of the heaving motions, obtained by analyzing the corresponding time series depicted in Fig. 14(a) and (b) respectively. (The Fourier analyses were conducted using the statistical analysis tools provided in Microsoft Excel. The number of data points used in the present analyses was 4096.) In Fig. 16(a), (b), we can observe that the Fourier spectrum of the heave motions is distinctively narrower banded than that of the waves. Since, as shown in Fig. 17, the frequency range in which relevant heave motions take place is distinctively narrower than that of the waves, the Fourier spectrum of the corresponding heave motions, which is the product of the RAO (Response Amplitude Operator) of the heave motions and the Fourier spectrum of the waves, results in a narrower band. (The RAO is the motion amplitude in regular waves of unit amplitude.) This should not be the case only in the particular case dealt with in the present study, because, in general, motions induced by regular waves diminish quite rapidly as the wave frequency increases. If the corresponding time series is narrow banded, the number of relevant regular motions/waves composing the time series is fewer than for a wider-banded time series, and this, in turn, implies that the irregularity of the corresponding time series is simpler and thus it should be easier for the RNN to decipher the hidden regularity of the time series. (Actually, the author once examined the predictability of an irregular wave train composed of only 3 regular waves of different frequencies with RNN, and found that the RNN could forecast the next-coming waves almost perfectly.)

5. Conclusions

In the present study, the feasibility of forecasting the peak heights of the next-coming waves from the known peak heights of a preceding wave train by use of recurrent neural networks has been investigated. The investigation was carried out using both irregular wave trains produced in a computer and those measured in a real sea. Other than forecasting waves, the present study was further extended to the feasibility of forecasting the next motions of a floating body in an irregular wave train.

Although the results here are preliminary, they are encouraging. Even the peak heights of a wave train in a real sea turned out to be reasonably predictable by the RNN. Perhaps surprisingly, the RNN could forecast motions with even higher accuracy than waves. Some plausible reasons why motions can be predicted with higher accuracy than waves were examined. Since the process of mechanical learning in the RNN is based on exploring regularities in the wave train data sets, the methodology may also be applicable to predicting such highly nonlinear phenomena as 'freak waves' or 'rogue waves' (e.g. Clauss and Marco, 2009). Predicting anomalous or extreme events using machine learning has been seen in some other fields (Närhi et al., 2018).

Since results of many cases are shown in the present article, all the cases dealt with in the article are summarized in Table 10.


Fig. 15. Comparisons between peak heights of heave motions predicted by RNN and real peak heights
(Motion-1) (r: correlation coefficient).

Fig. 16. Fourier spectra of waves and heave motions.

Table 10
Summary of the cases dealt with in the present analyses.
(In Case-1, peak heights themselves were subjected to the analyses, while, in the other cases (Case-2 ~ Case-6, Real-1 ~ Real-3, Motion-1), peak heights were sorted into classes as shown in Table 2, and, instead of the peak heights themselves, the RNN learned and forecast to which class the corresponding peak belonged.)

Case-1      Using the wave time series shown in Fig. 6(a), 500 peaks were used for mechanical learning of RNN.
Case-2      Using the wave time series shown in Fig. 6(a), 500 peaks were used for mechanical learning of RNN.
Case-3      Using the wave time series shown in Fig. 6(b), 500 peaks were used for mechanical learning of RNN.
Case-4      Using the wave time series shown in Fig. 9, 1000 peaks were used for mechanical learning of RNN.
Case-5      Using the wave time series shown in Fig. 9, 500 peaks were used for mechanical learning of RNN.
Case-6      The (n+5)th peak height, instead of the (n+1)th peak height, was predicted from the preceding n peak heights.
Real-1      The actual wave time series shown in Fig. 11(a) as Real-1 were used for the analyses. 500 peaks were used as training data.
Real-2      The actual wave time series shown in Fig. 11(b) as Real-2 were used for the analyses. 500 peaks were used as training data.
Real-3      The actual wave time series shown in Fig. 11(c) as Real-3 were used for the analyses. 500 peaks were used as training data.
Motion-1    Motion peak heights were predicted by RNN.

Fig. 17. RAO (response amplitude operator) of heave motions.

Declaration of competing interest

The authors declare that they have no known competing financial interests or personal relationships that could have appeared to influence the work reported in this paper.

CRediT authorship contribution statement

Hiroshi Kagemoto: Data curation, Formal analysis, Investigation, Writing - original draft.

Acknowledgements

The author acknowledges that the wave data measured in a real sea were provided by courtesy of Professor Takuji Waseda of the University of Tokyo. The real-sea wave measurement was conducted by Professor Waseda as part of the work consigned by Mitsui E&S in a research project financially supported by NEDO (New Energy and Industrial Technology Development Organization) in Japan.


Appendix

The following is part of the actual program code written and used by the author for the analyses presented, which may give a concrete idea of the architecture of the artificial neural network used. (The program code is written in Python.) In the following, train_data_x, train_data_y are the data subjected to the mechanical learning, while test_data_x, test_data_y are the data subjected to the prediction.

from keras.models import Sequential        # import statements assumed; the original excerpt begins at the model definition
from keras.layers import LSTM, Dense, Activation
from keras.optimizers import Adam

model = Sequential()
model.add(LSTM(20, batch_input_shape=(None, 20, 1)))   # 20 LSTM units; each input sample is a window of 20 peak heights
model.add(Dense(1))                                    # single output: the next peak height
model.add(Activation('sigmoid'))                       # output in 0-1, matching the normalized peak heights
optimizer = Adam(lr=0.001)
model.compile(optimizer=optimizer, loss='mean_squared_error', metrics=['accuracy'])
# mechanical learning
model.fit(train_data_x, train_data_y, epochs=4000, batch_size=20)
# prediction
test_data_y = model.predict(test_data_x)

References

Abraham, B., Ledolter, J., 2009. Statistical Methods for Forecasting. John Wiley & Sons.
Clauss, G.F., Marco, K., 2009. The new year wave: spatial evolution of an extreme sea state. J. Offshore Mech. Arctic Eng. 131 (Issue 4), 1-9.
Connor, J.T., Martin, R.D., Atlas, L.E., 1994. Recurrent neural networks and robust time series prediction. IEEE Transactions on Neural Networks 5 (No. 2), 240-254.
Cummins, W.E., 1962. The impulse response function and ship motions. Schiffstechnik 9, 101-109.
Dean, R.G., 1990. Freak waves: a possible explanation. In: Tørum, A., Gudmestad, O.T. (Eds.), Water Wave Kinematics. NATO ASI Series (E: Applied Sciences), vol. 178. Springer, Dordrecht, pp. 609-612.
Deo, M.C., Naidu, C.S., 1998. Real time wave forecasting using neural networks. Ocean Eng. 26 (Issue 3), 191-203.
gCaptain, 2018. World's first autonomous shipping company established in Norway. https://gcaptain.com/worlds-first-autonomous-shipping-company-coming-to-norway/.
Gopinath, D.I., Dwarakish, G.S., 2015. Wave prediction using neural networks at New Mangalore Port along west coast of India. Aquat. Procedia 4, 143-150.
Günaydin, K., 2008. The estimation of monthly mean significant wave heights by using artificial neural network and regression methods. Ocean Eng. 35 (Issues 14-15), 1406-1415.
Hochreiter, S., Schmidhuber, J., 1997. Long short-term memory. Neural Comput. 9 (No. 8), 1735-1780.
Inoue, A., Jin, L., Rossi, B., 2017. Rolling window selection for out-of-sample forecasting with time-varying parameters. J. Econom. 196, 55-67.
Jain, P., Deo, M.C., 2008. Artificial intelligence tools to forecast ocean waves in real time. Open Ocean Eng. J. 1, 13-20.
James, S.C., Zhang, Y., O'Donncha, F., 2018. A machine learning framework to forecast wave conditions. Coast. Eng. 137, 1-10.
Ma, X., Tao, Z., Wang, Y., Yu, H., Wang, Y., 2015. Long short-term memory neural network for traffic speed prediction using remote microwave sensor data. Transport. Res. C Emerg. Technol. 54, 187-197.
Mase, H., Yasuda, T., Mori, N., Sept. 2011. Real-time prediction of Tsunami magnitudes in Osaka Bay, Japan, using an artificial neural network. J. Waterw. Port, Coast. Ocean Eng. 263-268.
Mishra, N., Soni, H.K., Sharma, S., Upadhyay, A.K., 2018. Development and analysis of artificial neural network models for rainfall prediction by using time-series data. I.J. Intell. Syst. Appl. 16-23.
Molteni, F., Buizza, R., Palmer, T.N., Petroliagis, T., 1996. The ECMWF ensemble prediction system: methodology and validation. Q. J. Roy. Meteorol. Soc. 122 (Issue 529), 73-119.
Närhi, M., Salmela, L., Toivonen, J., Billet, C., Dudley, J.M., Genty, G., 2018. Machine learning analysis of extreme events in optical fibre modulation instability. Nat. Commun. 9, 4923.
Nelson, D.M.Q., Pereira, A.C.M., de Oliveira, R.A., 2017. Stock market's price movement prediction with LSTM neural networks. Proc. Int. Joint Conf. Neural Network. 1419-1426.
Reikard, G., Rogers, W.E., 2011. Forecasting ocean waves: comparing a physics-based model with statistical models. Coast. Eng. 58 (Issue 5), 409-416.
Reikard, G., Pinson, P., Bidlot, J.-R., 2011. Forecasting ocean wave energy: the ECMWF wave model and time series methods. Ocean Eng. 38 (Issue 10), 1089-1099.
Rogers, W.E., Kaihatu, J.H., Hsu, L., Jensen, R.E., Dykes, J.D., Holland, K.T., 2007. Forecasting and hindcasting waves with the SWAN model in the Southern California Bight. Coast. Eng. 54 (Issue 1), 1-15.
Rolls Royce, 2018. World's First Autonomous Ferry. https://www.rolls-royce.com/media/press-releases/2018/03-12-2018-rr-and-finferries-demonstrate-worlds-first-fully-autonomous-ferry.aspx.
Ruiz, L.G.B., Rueda, R., Cuéllar, M.P., Pegalajar, M.C., 2018. Energy consumption forecasting based on Elman neural networks with evolutive optimization. Expert Syst. Appl. 92, 380-389.
Tolman, H.L., 2009. User Manual and System Documentation of WAVEWATCH-III Version 3.14. NOAA/NWS/NCEP/MMAB Tech. Note 276, 219 pp.
Williams, R.J., Zipser, D., 1989. A learning algorithm for continually running fully recurrent neural networks. Neural Comput. 1, 270-280.
Zhao, Z., Chen, W., Wu, X., Chen, P.C.Y., Liu, J., 2017. LSTM network: a deep learning approach for short-term traffic forecast. IET Intell. Transp. Syst. 11 (Issue 2), 68-75.
