International Journal of Hybrid Information Technology, Jul 31, 2016
Selangor is an important river basin adjacent to the city of Kuala Lumpur, the federal capital of Malaysia, and it supplies about 70% of the water required for domestic and industrial use in the city. The Selangor river basin is presently regulated by two water supply dams, the Tinggi dam and the Selangor dam. Water is abstracted at an intake located 21 km and 42 km downstream of the Tinggi and Selangor dams respectively. In the wet season, when unregulated flows downstream of the dams are sufficient for abstraction, no releases from the dams are required. However, releases are required in the dry season when downstream flows fall below the normal level. The present practice in dam operation is to use recession analysis for low-flow forecasting during prolonged dry periods. Recession constants are derived from streamflow data, and future flows are forecast from the current flow and the recession constants on the assumption that no rain falls over the forecast period; release decisions are then made accordingly. The disadvantage of recession analysis in forecasting low flow is that the forecast is inaccurate if rain does fall during the period, and over-release will occur. This study reports the use of Artificial Neural Network (ANN) models to forecast river flows one and two time steps ahead at the Rantau Panjang gauging station near the water supply intake, for the different travel times from the dams to the intake point, to help determine the regulating releases from the dams for more efficient reservoir operation. Two ANN models, the Multi-Layer Perceptron (MLP) and the General Regression Neural Network (GRNN), were developed and their performances compared.
Endogenous and exogenous input variables such as streamflow and rainfall at various lags were used and compared for their ability to make future flow predictions. The required input variables were selected by considering statistical properties of the recorded rainfall and flow, such as the cross-correlation between flow and rainfall and the auto- and partial autocorrelation of the flows, that best represent the catchment response. Results show that both methods perform well in terms of R², but the GRNN models generally give lower RMSE and MAE values, indicating their superiority over the MLP models.
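The recession-based forecast that the ANN models are meant to replace can be sketched in a few lines. This is a minimal illustration, assuming the classical form Q(t) = Q0·K^t; the recession constant and flow values below are made up, not taken from the study.

```python
# Low-flow forecasting by recession analysis; the recession constant K
# and the flow values here are illustrative, not taken from the study.
def recession_forecast(q_current, k, steps):
    """Forecast `steps` time steps ahead assuming no rain, using
    Q(t) = Q0 * K**t with recession constant 0 < K < 1."""
    return [q_current * k ** t for t in range(1, steps + 1)]

# Example: current flow 12 m3/s, K = 0.95, two-step-ahead forecast
# of the kind used to decide regulating releases.
forecast = recession_forecast(12.0, 0.95, 2)
print(forecast)
```

If rain falls during the forecast window, actual flows exceed these recession estimates, which is exactly the over-release problem the abstract describes.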
The Journal of The Institution of Engineers, Malaysia, 2016
Flood estimation by fitting the frequency of occurrence of annual peak discharges with the Log-Pearson Type 3 distribution is commonly used, but the estimates are sensitive to the skew coefficients of the gauging stations. Estimation accuracy can be improved by using a weighted average population skew coefficient calculated from the sample station skew and the generalised unbiased skew. The U.S. Water Resources Council (WRC) has documented guidelines for estimating the generalised skew coefficients and published a map of generalised skew values for the United States. The map shows isolines of skew coefficient values and the average skew coefficient for each 1-degree quadrangle of latitude and longitude. Following the WRC guidelines, many state authorities in the US have developed generalised skew coefficients separately on a state or regional basis. In Malaysia, the Log-Pearson Type 3 distribution has been widely used for flood peak estimation but there are n...
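The weighted average skew mentioned above can be sketched as follows, following the WRC (Bulletin 17B) approach of weighting each skew estimate inversely by its mean square error; the numerical values are illustrative only, not the study's data.

```python
# Weighted skew coefficient per the WRC (Bulletin 17B) approach: the
# station skew and the generalised skew are weighted inversely by
# their mean square errors (all values below are illustrative).
def weighted_skew(g_station, mse_station, g_general, mse_general):
    """Weight each skew estimate by the inverse of its MSE."""
    return (mse_general * g_station + mse_station * g_general) / (
        mse_station + mse_general)

gw = weighted_skew(g_station=0.45, mse_station=0.18,
                   g_general=-0.10, mse_general=0.302)
print(round(gw, 3))
```

The weighted skew always lies between the station skew and the generalised skew, pulled toward whichever estimate has the smaller MSE.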
International Journal of Hybrid Information Technology, 2016
This study evaluates the use of Multi-Layer Perceptron (MLP) neural network models to forecast water levels at a gauging station located at the Kuala Lumpur city centre in Malaysia using the records of multiple upstream stations. Cross-correlation analysis of the water level data was performed to determine the input vectors, which include the current and antecedent water levels of the upstream stations, so that the most influential values at different lags are selected from the available data. Twelve well-recorded storm events were used to train, test and validate the MLP models. The best performance, based on MSE, MAE and R², was achieved with a model of 15 input vectors of upstream current and antecedent water levels, 7 hidden nodes and an output vector for the station at the Kuala Lumpur centre. The R² values for the training, testing and validation datasets are 0.81, 0.85 and 0.85 respectively.
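The construction of lagged input vectors described above can be sketched as follows. This is a simplified single-station illustration with hypothetical levels and lags, not the study's actual 15-input, multi-station configuration.

```python
# Building lagged input vectors for an MLP water-level model (a sketch;
# the series and the lag count are illustrative, not the study's
# actual multi-station configuration of 15 inputs).
def make_lagged_inputs(series, lags):
    """Return (X, y): each row of X holds the current value plus the
    `lags` antecedent values; y is the one-step-ahead level."""
    X, y = [], []
    for t in range(lags, len(series) - 1):
        X.append(series[t - lags:t + 1])  # current + antecedent levels
        y.append(series[t + 1])           # one-step-ahead target
    return X, y

levels = [2.1, 2.3, 2.8, 3.5, 3.9, 3.6, 3.0, 2.6]
X, y = make_lagged_inputs(levels, lags=2)
print(X[0], y[0])  # first training pattern
```

In the study, such patterns from twelve storm events would then be split into training, testing and validation sets before fitting the MLP.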
The Journal of The Institution of Engineers, Malaysia, 2018
Temporal and spatial variations of a flood hydrograph moving through a river reach can be simulated using flood routing tools such as hydrodynamic, hydrological and Artificial Neural Network (ANN) models. ANN models have emerged as viable tools in flood routing and are widely adopted for this purpose. The aim of this study is to make an objective comparison of two of these flood routing models, the ANN and the Muskingum models, to evaluate their individual performance. Four flood events recorded for the Klang river at Kuala Lumpur between October 1973 and December 1974, at the Leboh Pasar and Sulaiman Bridge stations 950 m apart, were used for this study. The statistical performance of the models is assessed using criteria such as peak flow, root mean square error, mean absolute error and the Nash–Sutcliffe coefficient. Results from calibration runs for the 02/05/1974 flood event show that the MAE, RMSE and NSE for the ANN and Muskingum models are 0.75, 1.24 and 0.9917, and 1.1, 1.3 and 0.992 respectively. The performan...
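The goodness-of-fit criteria named above have standard definitions, sketched below; the observed and routed flow values are made up for illustration.

```python
# Standard goodness-of-fit metrics used to compare routing models
# (the observed/simulated values below are illustrative only).
def mae(obs, sim):
    return sum(abs(o - s) for o, s in zip(obs, sim)) / len(obs)

def rmse(obs, sim):
    return (sum((o - s) ** 2 for o, s in zip(obs, sim)) / len(obs)) ** 0.5

def nse(obs, sim):
    """Nash-Sutcliffe efficiency: 1 indicates a perfect fit."""
    mean_obs = sum(obs) / len(obs)
    num = sum((o - s) ** 2 for o, s in zip(obs, sim))
    den = sum((o - mean_obs) ** 2 for o in obs)
    return 1 - num / den

obs = [10.0, 14.0, 21.0, 17.0, 12.0]
sim = [9.5, 14.8, 20.1, 17.6, 12.3]
print(mae(obs, sim), rmse(obs, sim), nse(obs, sim))
```

An NSE close to 1, as both models achieve in the calibration runs, means the routed hydrograph explains almost all of the variance of the observed one.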
International Journal of Hybrid Information Technology, 2016
Water from the Selangor river basin has been the main source of supply for the federal capital of Kuala Lumpur and the various districts of Selangor State, located in the centre of Peninsular Malaysia, since 2000. As this region is the most populated area in the country, its rapid economic development and population growth have caused concern over the adequacy of the quantity and quality of water abstracted from the Selangor basin, both at present and in the future. A recent prolonged drought caused one month of water rationing in Kuala Lumpur and adjacent areas, seriously affecting the everyday life of the people and the industrial and agricultural sectors. A rainfall drought analysis is therefore required for assessing the severity of drought events in the study area. In this study, L-moments have been used to compute the rainfall quantile values for 9 probabilities, 6 durations and 12 starting months for the 2 regions across the Selangor basin. The choice of the Pearson Type III and Wakeby distributions for fitting the Selangor rainfall data is presented. Quantile values are expressed as a percentage of the mean rainfall of the particular duration and presented as drought indication maps. Rainfall of a specific return period can be calculated easily using these maps and the mean rainfall for the particular station. The return period (severity) of a historical rainfall event at a station can be determined by comparing the magnitude of the historical event with that of the rainfall event of a particular return period. This helps in the monitoring and management of droughts.
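The first two sample L-moments, on which the regional frequency analysis above is built, can be computed from probability-weighted moments as sketched below; the rainfall depths are illustrative, not Selangor data.

```python
# First two sample L-moments via probability-weighted moments
# (l1 = L-location, l2 = L-scale); the rainfall depths are made up.
def l_moments_12(sample):
    x = sorted(sample)  # ascending order statistics
    n = len(x)
    b0 = sum(x) / n
    # b1 = (1/n) * sum over i of ((i-1)/(n-1)) * x_(i), i = 1..n
    b1 = sum(i * xi for i, xi in enumerate(x)) / (n * (n - 1))
    l1 = b0             # L-location (the sample mean)
    l2 = 2 * b1 - b0    # L-scale
    return l1, l2

rain = [88.0, 102.0, 75.0, 130.0, 95.0]  # e.g. monthly totals in mm
l1, l2 = l_moments_12(rain)
print(l1, l2, l2 / l1)  # l1, l2 and the L-CV ratio
```

Ratios such as the L-CV (l2/l1) are what make L-moments robust for fitting distributions like Pearson Type III and Wakeby to short rainfall records.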
The Journal of The Institution of Engineers, Malaysia, 2018
Rainfall temporal patterns are needed as inputs for hydrologic models, such as the unit hydrograph or runoff routing methods used in the derivation of flood hydrographs. The patterns adopted can have a major effect on the computed flood. Both short- and long-duration rainfall data are required for catchments of different sizes to determine and locate the flood-producing critical storms in flood estimation. Design temporal patterns of different durations are therefore also required for distributing the storm rainfall in flood calculations. Patterns for a large number of durations with reasonably short time intervals are needed by designers to reduce the need for interpolation and to maintain accuracy in obtaining the peak of the hydrograph. In this study, pluviograph data for the Upper Klang Catchment, with records of over 30 years, are used to derive temporal patterns for 20 standard durations as per ARR87. Rainfall temporal patterns for the upper Klang were derived for rainf...
The Journal of The Institution of Engineers, Malaysia, 2018
In this study, the Muskingum and Lag river routing models were used to estimate the routing coefficients of the Klang river using flood records from the gauging stations at Leboh Pasar and Sulaiman Bridge, for a river reach of 900 m between the two stations. Recorded inflow hydrographs at Leboh Pasar and outflow hydrographs at Sulaiman Bridge were used to calibrate and validate the Muskingum and Lag routing models. Best-fit routing parameters for the models were derived using the optimization module of HEC-HMS, and the average parameters obtained were used to validate the routing models against different sets of flood events. The outflow hydrographs at Sulaiman Bridge derived from the routing models were compared with the observed hydrographs by plotting them for visual inspection. The ability of the routing models to reproduce the observed flow at Sulaiman Bridge was also assessed statistically by calculating goodness-of-fit indices from the routed and observed hyd...
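The Muskingum routing step referred to above has the standard form O2 = C0·I2 + C1·I1 + C2·O1, with the coefficients derived from the storage constant K, the weighting factor X and the time step. The sketch below uses hypothetical K, X and inflow values, not the calibrated Klang parameters.

```python
# One Muskingum routing pass: O2 = C0*I2 + C1*I1 + C2*O1.
# K, X and the inflow hydrograph are illustrative, not calibrated values.
def muskingum_coeffs(K, X, dt):
    d = 2 * K * (1 - X) + dt
    c0 = (dt - 2 * K * X) / d
    c1 = (dt + 2 * K * X) / d
    c2 = (2 * K * (1 - X) - dt) / d
    return c0, c1, c2  # c0 + c1 + c2 == 1 by construction

def route(inflow, K, X, dt, o0):
    c0, c1, c2 = muskingum_coeffs(K, X, dt)
    out = [o0]
    for i1, i2 in zip(inflow, inflow[1:]):
        out.append(c0 * i2 + c1 * i1 + c2 * out[-1])
    return out

inflow = [10.0, 30.0, 60.0, 45.0, 25.0, 15.0]  # hypothetical, m3/s
outflow = route(inflow, K=0.5, X=0.2, dt=0.5, o0=10.0)
print([round(q, 2) for q in outflow])
```

In practice, as in the study, K and X are found by fitting routed outflows against the observed downstream hydrograph (HEC-HMS automates this search in its optimization module); a properly routed hydrograph shows the expected attenuation of the peak.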
A set of quality-controlled streamflow data is always required in the planning, design and management of water resources projects. Although every effort has been made by the authorities to collect complete and continuous hydrological data such as rainfall and streamflow, gaps and incomplete data sets of inadequate length are frequently encountered. These can be due to faulty field instruments, the occurrence of natural disasters and other reasons. Over the years, various techniques have been developed to infill missing data, especially streamflow data. These techniques include regression analysis, rainfall-runoff modelling and the use of artificial neural network (ANN) data-driven models. In this study, the HEC-HMS model is used to simulate the long-term daily streamflow of Sg Melaka. The process involved using recorded flow and rainfall data of 1989-1992 to calibrate the model, and validating the model using records of 1985-1986. Results show that the m...
Data from 4 pluviograph stations in the upper Klang basin were used to derive time distributions of heavy rainfall using the NOAA method. Rainfall cases for the temporal distribution analysis were selected from the annual maximum series usually used in rainfall frequency analysis. Each case (i.e., maximum) was the total accumulation over a selected duration (1, 6, 12 and 24 hours for this study). For each rainfall case, cumulative rainfall amounts were converted into percentages of the total rainfall amount at specified time increments. All cases for a specific duration were then combined, and these rainfall cases were analysed separately, determining the percentage accumulated at 10, 20, 30, 40, 50, 60, 70, 80, 90 and 100% of the total duration. For each duration, the percentage was determined by a percentage series of total rainfall, and the probabilities calculated. To obtain the rainfall values based on the above definition, linear interpolations were carried out between the probabilities and the immediately previous and subsequent probabilities. The temporal distribution curves for nine deciles (10% to 90%) were plotted on the same graph. Results show that first-quartile and second-quartile storms occurred most frequently with durations less than or equal to 12 hours, and first-quartile and fourth-quartile storms most often had durations of 24 hours. Following the principles of Huff (1990), the temporal distribution curves derived in this study are recommended for normal design as follows: • For a 1-hour duration, it is recommended that the second-quartile relations be used to establish typical time distributions. • Time distributions for storms lasting 6, 12 and 24 hours are most likely to conform to a first-quartile distribution. For most purposes, the median curves are probably most applicable to design.
These curves are more firmly established than the more extreme curves, such as those for the 10% and 90% probability levels, which are determined from a relatively small portion of each quartile's sample. However, the extreme curves should be useful when runoff estimates are needed for unusual storm conditions, such as those typified by the 10% curves.
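The first processing step of the method above, converting one rainfall case into cumulative percentages of its total at deciles of the storm duration, can be sketched as follows. The increments are made up, and nearest-index sampling is used here as a simplification of the interpolation the study performs.

```python
# Converting one rainfall case into cumulative percentages of its
# total at each decile of duration, as in the NOAA temporal-
# distribution method. Increments are illustrative; nearest-index
# sampling stands in for the study's linear interpolation.
def cumulative_percent(increments, points=10):
    """Cumulative rainfall as % of total at each decile of duration
    (assumes equal time steps)."""
    total = sum(increments)
    per_decile = len(increments) / points
    out = []
    for k in range(1, points + 1):
        idx = int(round(k * per_decile))
        out.append(100.0 * sum(increments[:idx]) / total)
    return out

# A 1-hour case in six 10-minute increments (mm): a front-loaded storm.
case = [12.0, 18.0, 9.0, 6.0, 3.0, 2.0]
print([round(p, 1) for p in cumulative_percent(case, points=10)])
```

Pooling such percentage series over all cases of a duration, and reading off probability levels within the pool, yields the decile curves (10% to 90%) plotted in the study.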
2012 9th International Conference on Fuzzy Systems and Knowledge Discovery, 2012
Social networks need to manage and control the flow of vast amounts of information by filtering and ranking content so that it is presented appropriately to users. However, social network ranking is currently dictated by fairly straightforward optimization algorithms. There is therefore a need for an enhanced ranking algorithm, since users occasionally see content they should not. This paper proposes composing a generic score and a collective score into a new algorithm called E.L.I.T.E., which comprises five essential elements (Engagement-U, Lifetime, Impression, Timeframe and Engagement-O) to produce more accurate results, so that users see more of what they care about and less of what they do not, and more of whom they are interested in and less of whom they are not. Engagement-U is the affinity between users, measured by the relationships and other shared interests between them; Lifetime is a trace of a user's past, based on their positive, neutral and negative interactions with other users; Impression is the weight of each object, determined by the number of positive responses from users; Timeframe is a timeline scoring technique in which an object naturally loses value as time passes; and Engagement-O is the attraction of users to objects, measured between objects and the associated interests of users.
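One plausible way the five elements could be composed into a single score is sketched below. The abstract does not publish the actual formula, so the exponential time decay, the additive combination and all weights here are hypothetical.

```python
# Hypothetical composition of the five E.L.I.T.E. elements into one
# ranking score; the decay model and weights are assumptions, not
# taken from the paper.
import math

def elite_score(engagement_u, lifetime, impression,
                age_hours, engagement_o, half_life=24.0):
    # Timeframe: the object loses value as time passes (assumed
    # exponential decay with a hypothetical 24-hour half-life).
    timeframe = math.exp(-math.log(2) * age_hours / half_life)
    generic = impression * timeframe                      # object score
    collective = engagement_u + lifetime + engagement_o   # user-object
    return generic + collective

# A fresh post with strong user affinity outranks a stale popular one.
fresh = elite_score(0.8, 0.6, 40, age_hours=2, engagement_o=0.7)
stale = elite_score(0.2, 0.3, 90, age_hours=96, engagement_o=0.1)
print(fresh > stale)
```

The point of the sketch is the interplay the abstract describes: Timeframe erodes Impression, while the user-centred elements keep relevant content visible.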
Highlights: • We implement SeTM, a hybrid system which supports TM and TLS simultaneously at modest hardware cost. • SeTM uses R/W bits to increase the accuracy of signatures in conflict detection. • SeTM proposes a new fast-rollback mechanism to handle the slow-abort problem in TM and TLS. • SeTM introduces a conflict-tolerant mechanism to tolerate WAW and WAR data conflicts in TLS. • SeTM proposes an ordering mechanism to achieve efficient execution of unordered transactions in TM.
2012 International Conference on Advanced Computer Science Applications and Technologies (ACSAT), 2012
With the introduction of the World Wide Web, users are able to connect and interact with each other easily from anywhere in the world. However, elderly users may not be familiar with these technologies, and some are unable to cope with the rapid changes of the World Wide Web; they are often slow to adopt, and unfamiliar with, the latest trends and technology. This paper aims to address the gap between elderly users and the rapid changes of Internet technologies. We use different user-centred design techniques to elicit design requirements for elderly users. These techniques help to gather the necessary user requirements from the elderly users. We apply these techniques to a simple case study, a web site design. We develop a combination of a low-fidelity prototype and a high-fidelity prototype to obtain the different perspectives of the elderly users for establishing usability requirements.
Papers by Hong Kee An