The increase in anthropogenic activities and the growth of technology in Antarctica are fuelled by the high demand for petroleum hydrocarbons needed for daily activities. Oil and fuel spills that occur during explorations have caused hydrocarbon pollution in this region, prompting concern for the environment among polar communities and the larger world community. Crude oil and petroleum hydrocarbon products contain a wide variety of lethal components with high toxicity and low biodegradability. The persistence of hydrocarbons in the Antarctic environment worsens the pollution problem, as its effects can be long-term. Numerous efforts to lower the contamination level caused by these pollutants have been conducted, mainly through bioremediation, an economical and effective degradation method. Bioremediation functions mainly through the conversion of complex toxic compounds into simpler organic compounds, as microorganisms consume hydrocarbons as their energy source. This review presents a summary of the collective understanding of bioremediation of petroleum hydrocarbons by microorganisms indigenous to the Antarctic region, from past decades to current knowledge.
The control of cost and time in construction projects has been one of the most important issues since the emergence of the construction industry. A successful project should meet not only quality output standards, but also time and budget objectives. The management and control of cost and time is fundamental in every construction project, and an effective cost and time management and control technique is important in managing the risk of cost overrun and delay in project completion. Construction projects are becoming more complex as they now involve many stakeholders from different disciplines. Building Information Modelling (BIM), an alternative technology, is believed to solve issues related to project cost and time control as it efficiently increases collaboration between stakeholders. The aim of this paper is to review and summarise the causes of delay and cost overrun in the construction industry, which are the main causes of disputes and abandonment of projects. It was found that delays and cost overruns have plagued the industry and left it with a bad image for decades, even with rapid advancement in technology. The review of the applications of BIM showed that most of them are geared towards minimising construction cost and the time spent on projects. This suggests that the use of BIM in the management of construction projects has a great impact on project cost and time.
BIM, cost control, delay and cost overrun, time control
Android devices have gained a lot of attention in the last few decades for several reasons, including ease of use, effectiveness, availability and games, among others. To take advantage of Android devices, mobile users have begun installing an increasingly substantial number of Android applications on their devices. Rapid growth in the number of Android devices and applications has led to security and privacy issues. It has, for instance, opened the way for malicious applications to be installed on Android devices while users download applications for different purposes. Such malicious applications execute illegal operations on the devices, and Android botnets are one such threat. This paper presents Android botnets in various aspects, including their security implications, architecture, infection vectors and techniques. This paper also evaluates Android botnets by categorising them according to behaviour. Furthermore, it examines Android botnets with respect to Android device threats. Finally, we investigate different Android botnet detection techniques in depth with respect to the existing solutions deployed to mitigate Android botnets.
Android botnets, malware, detection techniques, DDoS attacks, mobile security
Diabetes is one of the major life-threatening health problems worldwide today. It is one of the fastest-growing diseases, causing many health complications, and a leading cause of decreased life expectancy and high mortality rates. Many studies have suggested several different types of intervention to treat Type 1 diabetes, such as insulin therapy, islet transplantation, islet xenotransplantation and stem cell therapy. However, issues regarding the efficacy, cost and safety of these treatments are not always well addressed. For decades, diabetes treatments with few side effects and long-lasting insulin independence have remained one of the most challenging goals facing scientists. Among the treatments mentioned above, the application of human islet transplantation in patients with Type 1 diabetes has progressed rapidly, with significant achievements. However, the lack of appropriate donors for islet transplantation and its high cost have led researchers to look for other alternatives. In this review, we discuss pertinent issues related to diabetes treatments: their availability, advantages, disadvantages and cost.
This paper presents an optical double-ring resonator system whose design and analytical model are demonstrated to be useful as a novel micro-Newton force-measurement sensing device based on optical sensors. The sensing application can be accomplished by changing the optical filtering characteristic of an optical resonance structure such as the ring resonator system. Together with the concept of stress/strain and the elastic modulus of the waveguide material, the relationship between a slight difference in the force exerted on the sensing unit and the resulting difference in waveguide length can be evaluated. Changing the optical path length (the waveguide length) shifts the peak spectrum of the filtering signals obtained from the ring resonator system. Hence, by measuring the spacing shift between the sensing and setting peak signals in the considered channel, a slight difference in the forces exerted on the sensing unit can be measured. From the simulation results, exerted forces in the small-scale range from 10 µN to 50 µN were evaluated by measuring a spacing shift between the peak signals ranging from 35 pm to 225 pm. In this study, the potential of using such a double-ring resonator device for micro-Newton force-sensing applications is studied and discussed.
Force sensing, optical sensors, optical resonance, optical filter, ring resonator
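The stress/strain relationship underlying the sensing principle above can be sketched numerically. This is a hedged illustration only: the waveguide length, cross-section, elastic modulus and operating wavelength below are assumed values chosen for demonstration, not the paper's parameters.

```python
# Hypothetical sketch: relating an applied micro-Newton force to a
# resonance-peak shift in a ring-resonator waveguide. All parameter
# values are illustrative assumptions, not figures from the study.

def peak_shift(force_n, length_m, area_m2, youngs_modulus_pa, wavelength_m):
    """Elastic elongation dL = F*L/(A*E); the fractional change in
    optical path length shifts the resonance peak by about lam*dL/L."""
    delta_l = force_n * length_m / (area_m2 * youngs_modulus_pa)
    return wavelength_m * delta_l / length_m

# Example: 10 uN force on an assumed 5 um x 5 um waveguide, E ~ 130 GPa,
# operating near 1.55 um
shift = peak_shift(10e-6, 100e-6, 25e-12, 130e9, 1.55e-6)
```

A larger force produces a proportionally larger elongation, and hence a larger spacing shift between the setting and sensing peaks, which is the quantity read out in the abstract.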
This study investigated the distributions, composition patterns, sources and potential toxicity of polycyclic aromatic hydrocarbon (PAH) pollution in surface sediments from the Kim Kim River and Segget River, Peninsular Malaysia. The samples were extracted using Soxhlet extraction, purified using two-step silica gel column chromatography and then analysed by gas chromatography-mass spectrometry (GC-MS). The total PAH concentrations ranged from 95.17 to 361.24 ng g-1 dry weight (dw) and 330.09 to 552.76 ng g-1 dw in surface sediments from the Kim Kim and Segget Rivers, respectively. Source identification using PAH molecular indices and hierarchical cluster analysis (HCA) indicated that the PAHs were mostly of pyrogenic origin, while at some stations petrogenic sources contributed a significant portion. A PAH toxicity assessment using sediment quality guidelines (SQGs), the mean effect range-median quotient (M-ERM-Q), benzo[a]pyrene (BaP) equivalent concentration and BaP toxicity equivalent quotient (TEQcarc) indicated a low probability of toxicity for both the Kim Kim and Segget Rivers. Moreover, a human health risk assessment applying Cancer Risk (ingestion) and Cancer Risk (dermal) indicated that the probabilistic health risk to humans via ingestion and dermal pathways from sediments of the Kim Kim and Segget Rivers can be categorised as low-to-moderate.
Kim Kim River, Segget River, Malaysia, pollution sources, Polycyclic Aromatic Hydrocarbons (PAHs), sediment, ecological risk assessment, human health risk assessment
Ulcers in the gastrointestinal tract refer to any appreciable depth of break in the mucosal lining that may involve the submucosa. Common types of ulcer include peptic, gastric and duodenal ulcers, which may lead to chronic inflammation. Ulcers may be caused by excessive alcohol intake or prolonged use of non-steroidal anti-inflammatory drugs (NSAIDs), in addition to several other factors. Conventional medications such as Omeprazole (a proton pump inhibitor) and Ranitidine (an H2 blocker) for the management of ulcers may cause severe side effects such as myelosuppression and abnormal heart rhythm. This has driven researchers to explore the potential of natural products for the management of ulcers with reduced side effects. Kelulut honey (KH) is a type of honey produced by stingless bees of the Trigona species. It is believed to have many medicinal properties, such as antimicrobial, antioxidant and antidiabetic activities. Yet, no scientific study has been carried out on its antiulcer properties. This study was carried out to determine the antiulcer properties of KH. Eighteen male Sprague Dawley rats (5 to 6 weeks old, weighing between 200 and 300 g) were divided into three groups (n=6). The groups were 1) a normal control group (without ulcer, without KH), 2) a positive control group (with ulcer, without KH) and 3) a treatment group (with ulcer, treated with KH). The treatment, KH (1183 mg/kg), was given twice daily for 30 consecutive days by oral administration. On Day 31, the rats were induced with absolute ethanol (5 mL/kg) via oral administration after being fasted for 24 h and were sacrificed 15 min after the induction. The stomach was collected for macroscopic and histopathological evaluation. Pretreatment with KH significantly reduced (p<0.05) both the total area of ulcer and the ulcer index compared to the positive control group. The percentage of ulcer inhibition in the KH-pretreated group was 65.56% compared with the positive control group.
The treatment, KH, exhibited antiulcer properties against ethanol-induced gastric ulcer.
Biometric authentication refers to the use of measurable characteristics (or features) of the human body to provide secure, reliable and convenient access to a computer system or physical environment. These features (physiological or behavioural) are unique to individual subjects because they are usually obtained directly from their owner's body. Multibiometric authentication systems use a combination of two or more biometric modalities to provide improved performance accuracy without offering adequate protection against security and privacy attacks. This paper proposes a multibiometric matrix transformation based technique, which protects users of multibiometric systems from security and privacy attacks. The results of security and privacy analyses show that the approach provides high-level template security and user privacy compared to previous one-way transformation techniques.
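As a rough illustration of the matrix-transformation idea (not the authors' exact scheme), a fused multibiometric feature vector can be projected through a user-specific random matrix to form a cancellable template: if the stored template is compromised, issuing a new key regenerates a fresh, unlinkable template from the same biometrics. All names and dimensions here are hypothetical.

```python
import numpy as np

# Illustrative cancellable-template sketch: a one-way random projection
# of a fused feature vector, keyed by a user-specific seed. The fused
# vector and dimensions are toy values for demonstration.

def make_template(features, key_seed, out_dim=32):
    rng = np.random.default_rng(key_seed)        # user-specific key
    proj = rng.standard_normal((out_dim, len(features)))
    return proj @ np.asarray(features, float)    # many-to-one projection

fused = np.r_[np.linspace(0, 1, 40), np.linspace(1, 0, 24)]  # toy fusion
t1 = make_template(fused, key_seed=7)
t2 = make_template(fused, key_seed=8)            # revoked -> new template
```

Because the projection reduces dimension, many feature vectors map to the same template, which is what frustrates inversion attacks; changing the seed yields an uncorrelated template, supporting revocability.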
The main focus of this study was to obtain the optimum alkaline treatment for banana fibre and to determine its effect on the mechanical and chemical properties of the fibre, its surface topography, its heat resistivity, as well as its interfacial bonding with an epoxy matrix. Banana fibre was treated with sodium hydroxide (NaOH) under various treatment conditions. The treated fibres were characterised using FTIR spectroscopy. The morphology of a single fibre observed under a digital image analyser indicated a slight reduction in fibre diameter with increasing NaOH concentration. The Scanning Electron Microscope (SEM) results showed the surface-modifying effect of the alkali, as seen from the removal of impurities and the increase in surface roughness. The mechanical analysis indicates that 6% NaOH treatment with a two-hour immersion time gave the highest tensile strength. The adhesion between a single fibre and epoxy resin was analysed through the micro-droplet test. It was found that 6% NaOH treatment with a two-hour immersion also yielded the highest interfacial shear stress, of 3.96 MPa. The TGA analysis implies that alkaline treatment improved the thermal and heat resistivity of the fibre.
Phenol Formaldehyde (PF) resin has been extensively used in the manufacturing industry as a binding agent, especially in the production of wood-based panels, because of its ability to provide good moisture resistance, exterior strength and durability, as well as excellent temperature stability. However, because its formulation relies on finite petroleum-based phenol, there is strong interest in exploring renewable biomass materials to partially substitute it. In this study, slow pyrolysis was used to convert two types of biomass, namely, oil palm frond and Rhizophora hardwood, into bio-oil. The phenol-rich fraction of the bio-oil was separated and added into the formulation of PF resin to produce an environmentally friendly type of PF resin, known as bio-oil-phenol-formaldehyde (BPF) resin. This BPF resin was observed to have comparable viscosity, better alkalinity, improved non-volatile content and faster curing than conventional PF resin. Moreover, particleboard bonded with this BPF resin was observed to have bonding strength just as excellent as that bonded using conventional PF resin. However, the BPF resin exhibited an increased level of free formaldehyde and lower thermal stability than the conventional PF resin, probably due to the addition of the less reactive bio-oil.
Fibre-rich manure derived from grass-fed cattle showed significantly higher intrinsic sorption efficiency for Cr(VI) in solution as compared to corncob, sawdust and cogon grass. This observation could be attributed to the ligneous nature and rough surface morphology of the cattle manure (CM). A four-factor, three-level, face-centred composite design (FCCD) suggested the process was greatly affected by the initial pH of the solution, contact time and sorbent dosage (p<0.0001), while stirring rate had a negligible effect. The highest percentage removal (≈70%) occurred at pH 1-1.56, 0.79-1 g sorbent and 57-300 min contact time in 200 mg/L Cr(VI) solution. The process is spontaneous, endothermic and best described by the pseudo-second-order kinetic and Langmuir isotherm models. It was found that adsorbed Cr(VI) could be recovered and the CM could be reused at least three times with >50% adsorption efficiency. It is predicted that both physisorption and chemisorption are involved in the sorption process.
Biosorption, cattle manure, chromium (VI), heavy metals, response surface methodology
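The two best-fitting models named above have standard closed forms, sketched below. The parameter values (`q_max`, `b`, `qe`, `k2`) are hypothetical placeholders for illustration; the study's fitted constants are not reproduced here.

```python
# Standard forms of the Langmuir isotherm and the integrated
# pseudo-second-order kinetic model; parameter values are illustrative.

def langmuir(ce, q_max, b):
    """Langmuir isotherm: q_e = q_max * b * C_e / (1 + b * C_e)."""
    return q_max * b * ce / (1 + b * ce)

def pseudo_second_order(t, qe, k2):
    """Integrated pseudo-second-order kinetics:
    q_t = k2 * qe^2 * t / (1 + k2 * qe * t), approaching qe as t grows."""
    return (k2 * qe**2 * t) / (1 + k2 * qe * t)

q = langmuir(ce=200.0, q_max=25.0, b=0.05)        # uptake at 200 mg/L
qt = pseudo_second_order(t=300.0, qe=20.0, k2=0.001)  # uptake at 300 min
```

In practice the fitted `q_max` and `k2` come from linearised or nonlinear regression of the equilibrium and kinetic data, respectively.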
Amusement parks have been growing rapidly in Indonesia in the past five years but they seem affordable only for the middle and higher classes of society. Surabaya in East Java has several amusement parks that cater for low budgets and they have good-quality equipment and are safe to use. The purpose of this research is to show the results of building a low-cost amusement park. The amusement park was built by a player in the amusement park industry in collaboration with the Computer Engineering Department of Bina Nusantara University. An infinity mirror room (IMR) was built in one of the new amusement parks in Surabaya called Suroboyo Night Carnival. The full design of the IMR is discussed in this paper including the equipment used in the design. The IMR mainly uses fibre optics, LED and a mirror. The amusement park began operation on July 28, 2013, and the amount spent on one room was around US$8,500. No safety breaches have been reported. The facility has been able to attract 500 visitors on average from 2013 to 2016. The breakeven point of this facility was achieved in the first year of operation.
Amusement park, infinity mirror room, fibre optics, low-cost, Surabaya
The aim of this study was to evaluate the value of MR spectroscopy and its association with altered glucose metabolism on 18F-FDG PET/CT in patients with suspected breast cancer. Eight selected breast cancer patients with BIRADS 4 or 5 on mammogram were recruited, and the patients underwent 18F-FDG PET/CT and MRI (spectroscopy). The standardised uptake value (SUVmax) was analysed to determine the degree of altered glucose metabolism on the PET. The metabolites of tumour lesions were measured using in vivo proton MR spectroscopy (MRS) of the breast. There were eight females with a mean age of 55.3±12.2 years, with biopsy results of invasive ductal carcinoma (2), lobular carcinoma (1) and benign lesion (5). There was a significant difference between the mean SUVmax of the malignant tumours (4.28±3.74 g/ml) and that of the benign tumours (2.33±0.9 g/ml). On a per-lesion basis of the MRS correlation with SUVmax, the suspicious breast tissue exhibited raised creatinine metabolites (mean: 3.39±0.54 u) with a significant correlation with SUVmax (mean: 3.06±2.34), as compared to N-acetyl aspartate (NAA) (mean: 2.84±0.99 u) and choline (mean: 2.46±0.70 u). This study showed that high SUVmax was associated with malignant cancer, and the high creatinine metabolite level that correlated with SUVmax could potentially be utilised as a surrogate marker in detecting breast cancer.
The aims of this research are to study vertically integrated moisture flux convergence (VIMC) over Southeast Asia and to analyse its relationship to rainfall over Thailand during the period 1999 to 2013. Reanalysis data from the National Oceanic and Atmospheric Administration (NOAA) for the period 1999 to 2013 are used in this study. The monthly mean rainfall data are taken from the Global Precipitation Climatology Project (GPCP). Vertically integrated moisture transport (VIMT) is calculated by vertically integrating the moisture fluxes of the u and v wind components. The finite difference method is applied to obtain the vertically integrated moisture flux divergence (VIMD). The results show that VIMD over the Indian Ocean is strong, and moisture is directed from the Indian Ocean to Thailand by southwest winds, causing strong moisture convergence over Thailand during the rainy season, while in the summer season the moisture field shows strong divergence. Moisture transport increases from the South China Sea to Thailand during October to December, causing more moisture convergence over northern and northeastern Thailand. The relationship between rainfall and VIMC averaged over Thailand from 1999 to 2013 is confirmed by large positive correlations, consistent with Thailand's rainfall pattern over the study area.
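The finite-difference step described above can be sketched as a central-difference divergence of the VIMT components on a regular grid. The grid spacing and flux fields below are synthetic examples, not the NOAA reanalysis fields.

```python
import numpy as np

# Sketch: divergence of vertically integrated moisture transport
# components (Qu, Qv) by central differences on a regular grid.
# Synthetic fields; dx, dy would be metric grid spacings in practice.

def moisture_flux_divergence(qu, qv, dx, dy):
    """div = dQu/dx + dQv/dy at interior points; negative values
    indicate moisture convergence (VIMC)."""
    div = np.zeros_like(qu)
    div[1:-1, 1:-1] = ((qu[1:-1, 2:] - qu[1:-1, :-2]) / (2 * dx)
                       + (qv[2:, 1:-1] - qv[:-2, 1:-1]) / (2 * dy))
    return div

qu = np.tile(np.linspace(0, 9, 10), (10, 1))   # flux increasing eastward
qv = np.zeros((10, 10))                        # no meridional flux
d = moisture_flux_divergence(qu, qv, dx=1.0, dy=1.0)
```

With the eastward flux increasing linearly, every interior point shows a uniform positive divergence; a sign flip in the gradient would mark a convergence zone of the kind associated with the rainy season above.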
The issue of classifying objects into groups when the variables measured in an experiment are mixed has attracted the attention of statisticians. The Smoothed Location Model (SLM) is a popular classification method for handling data containing both continuous and binary variables simultaneously. However, SLM is infeasible for a large number of binary variables due to the occurrence of numerous empty cells. Therefore, this study aims to construct new SLMs by integrating SLM with two variable extraction techniques, Principal Component Analysis (PCA) and two types of Multiple Correspondence Analysis (MCA), in order to reduce the large number of mixed variables, primarily the binary ones. The performance of the newly constructed models, namely SLM+PCA+Indicator MCA and SLM+PCA+Burt MCA, is examined based on the misclassification rate. Results from simulation studies for a sample size of n=60 show that the SLM+PCA+Indicator MCA model provides perfect classification when the number of binary variables (b) is 5 or 10. For b=20, the SLM+PCA+Indicator MCA model produces misclassification rates of 0.3833, 0.6667 and 0.3221 for n=60, n=120 and n=180, respectively. Meanwhile, the SLM+PCA+Burt MCA model provides perfect classification when the number of binary variables is 5, 10, 15 or 20, and yields a small misclassification rate of 0.0167 when b=25. Investigation of a real dataset demonstrates that both of the newly constructed models yield low misclassification rates, of 0.3066 and 0.2336 respectively, with the SLM+PCA+Burt MCA model performing the best among all the classification methods compared. The findings reveal that the two new SLMs integrated with variable extraction techniques can be good alternative methods for classification in handling mixed-variable problems, particularly when dealing with a large number of binary variables.
Classification, large mixed variables, multiple correspondence analysis, Principal Component Analysis (PCA), Smoothed Location Model (SLM)
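The reduction step that precedes the location model can be sketched with NumPy alone. This is an illustration of the idea, not the authors' exact procedure: PCA is computed via SVD, and the indicator-matrix MCA is approximated here as PCA of the centred one-hot (indicator) matrix, a rough stand-in for a proper MCA with chi-square scaling.

```python
import numpy as np

# Sketch: reduce a mixed dataset (continuous + binary) before feeding
# a location model. PCA via SVD on each block; dimensions are toy values.

def pca_scores(x, k):
    """Scores of the first k principal components of x (columns centred)."""
    xc = x - x.mean(axis=0)
    u, s, _ = np.linalg.svd(xc, full_matrices=False)
    return u[:, :k] * s[:k]

rng = np.random.default_rng(0)
cont = rng.standard_normal((60, 8))              # continuous variables
binary = rng.integers(0, 2, size=(60, 20))       # b = 20 binary variables
indicator = np.hstack([binary, 1 - binary])      # one-hot indicator matrix

reduced = np.hstack([pca_scores(cont, 3),
                     pca_scores(indicator.astype(float), 3)])
```

The reduced matrix replaces the original mixed variables, so the smoothed location model sees far fewer binary-derived dimensions and far fewer empty multinomial cells.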
Research on natural fibres has been carried out over the past decades with the aim of developing low-cost, eco-friendly materials. The objective of this study is to fabricate composites utilising sawdust in various proportions and to compare their mechanical properties, such as tensile strength, flexural strength and hardness, in the dry condition and for specimens immersed in distilled water and salt water. The composites, with constant reinforcement of 15% and different percentages of matrix (85, 80, 75, 70%) and filler (0, 5, 10, 15%) by mass, were developed by the hand lay-up method and compared for their mechanical properties. The mechanical properties of the composites fabricated with sawdust up to a certain percentage showed an improvement compared to the composite with no filler; with further increase in filler, a drop in mechanical performance was noticed. Increases in tensile strength of 12.75%, flexural strength of 5.94%, hardness of 18.34%, tensile modulus of 100.3% and flexural modulus of 60.4% were observed in the dry condition compared to the composite with no filler. Mechanical degradation in tensile strength, tensile modulus, hardness and flexural strength was observed for the samples subjected to ageing in sea and distilled water. The flexural modulus after ageing increased with filler addition up to a certain percentage; with further increase in filler, a decrease was noticed. Higher mechanical degradation (except in flexural modulus) was observed for the specimens immersed in sea water.
Ageing, composite, epoxy, mechanical properties, sawdust, short coir fibre
Urbanisation increases the level of imperviousness in a catchment, so more rainfall is converted to runoff in urban areas. To mitigate this adverse situation, dispersed green infrastructure presents the best solution for reducing stormwater impact. Green roofs and rain gardens are extensively studied and widely covered in the literature, but this is not the case for green walls, which, more often than not, are treated as ornaments. Thus, this study developed a computer-aided stormwater model that incorporates a green wall to investigate its effectiveness as an urban drainage component. The effectiveness of employing a green wall as a stormwater component is tested using USEPA SWMM 5.1 and the embedded bioretention cell interface. Four simulation models with different conditions and precipitation inputs are tested, compared and discussed. The conditions include investigation of different soil types, average recurrence intervals (ARI) and storm durations with design and observed rainfall. The results reveal that with synthetic precipitation data, used in Scenarios 1, 2 and 3, runoff decreased by more than half, at 55%, under the condition of a one-year ARI and 5 minutes of storm duration. Meanwhile, Scenario 4 also shows a reduction of runoff by half after the integration of the green wall using the observed rainfall data. Thus, it is verified that a green wall can be effectively used as part of an urban drainage system in reducing surface runoff.
Bioretention, green wall, runoff, SWMM, urban stormwater management
Information on the air pollution situation is critically needed as input in four areas of research: risk management, risk evaluation, environmental epidemiology, and status and trend analysis. Two common practices were identified for evaluating the daily air pollution situation: first, pollution magnitude has been treated as the standard indicator, and second, the analysis has often been conducted based on hourly average data. However, information on the magnitude level alone, represented by a rigid point value such as the average, is insufficient to represent the pollution condition. Thus, to fill the gap, this study was conducted based on continuously measured data in the form of curves, also known as functional data, whereby pollution duration is emphasised. A statistical method based on curve ranking was used in the investigation. The application of the method at the Klang, Petaling Jaya and Shah Alam air quality monitoring stations located in the Klang Valley, Malaysia, has shown that pollution duration decreases as the magnitude increases. Shah Alam has the longest pollution duration at low and medium magnitude levels. Meanwhile, all three stations experienced quite a similar average pollution duration at the high magnitude level, that is, about 2.5 days. It was also shown that the occurrence of PM10 pollution in the area is significantly non-random.
Air pollution, functional data analysis, PM10, curve ranking, Malaysia
This paper presents the application of an active contours region-based method of image segmentation to Computed Tomography (CT) images. Previous researchers applied this region-based method to Magnetic Resonance Imaging (MRI), in vivo images and synthetic images containing intensity inhomogeneities. In this paper, a different modality, the Computed Tomography (CT) scan, was used. CT scans also produce images containing intensity inhomogeneity, and it was expected that this method would provide good segmentation results. The main objective of applying this method is to check its applicability to CT images. The segmentation process begins by finding the area of interest (the black region). Results from this experiment are then used in estimating the time of death. Experimental results show that this method successfully segmented the black region when some parameters were changed, provided that the regions are close to each other. If the black regions are located far from each other, then this method will only segment certain areas.
Local Gaussian distribution, computed tomography images, segmentation
One of the ways to calculate the dividend for an investment is by using the average lowest balance (ALB) concept. The existing calculation of dividends based on the ALB concept can only be done yearly. This paper discusses the development of a general formula to calculate the accumulated amount for any period of time, based on the ALB concept, that considers different yearly dividend rates. The patterns of each variable and coefficient in the calculated yearly accumulated amount were analysed. The general forms of each variable and coefficient were then combined to form the general formula for calculating the accumulated amount. The validity of the general formula is confirmed by calculating the percentage errors and proven by mathematical induction.
Investment, average lowest balance, pattern analysing, general formula, accumulated amount
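The ALB idea can be illustrated with a direct year-by-year computation, which is the kind of calculation the paper's closed-form formula generalises. This is a hedged sketch of the general concept only: each year's dividend is that year's rate applied to the average of the twelve monthly lowest balances. The paper's actual formula, which the sketch does not reproduce, handles arbitrary periods and compounding.

```python
# Hedged illustration of the average-lowest-balance (ALB) dividend idea;
# the closed-form general formula derived in the paper is not shown here.

def total_dividend(monthly_lowest_by_year, rates):
    """monthly_lowest_by_year: one 12-element list of lowest balances per
    year; rates: the matching yearly dividend rates."""
    total = 0.0
    for lows, r in zip(monthly_lowest_by_year, rates):
        alb = sum(lows) / len(lows)     # average of monthly lowest balances
        total += r * alb                # that year's dividend
    return total

# Two years, constant RM1,000 lowest balance, rates 3% then 4%
d = total_dividend([[1000.0] * 12, [1000.0] * 12], [0.03, 0.04])
```

With a constant balance the two yearly dividends are simply 30 and 40, so the direct computation gives 70; a general formula must reproduce such cases exactly, which is what the percentage-error check and induction proof in the abstract verify.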
Feature selection has been widely applied in many areas, such as classification of spam emails, cancer cells, fraudulent claims and credit risk, text categorisation and DNA microarray analysis. Classification involves building predictive models to predict the target variable based on several input variables (features). This study compares filter and wrapper feature selection methods for maximising classifier accuracy. Logistic regression was used as the classifier, while the performance of the feature selection methods was assessed based on classification accuracy, the Akaike information criterion (AIC), the Bayesian information criterion (BIC), the area under the receiver operating characteristic curve (AUC), as well as the sensitivity and specificity of the classifier. The simulation study involves generating data for continuous features and one binary dependent variable for different sample sizes. The filter methods used are correlation-based feature selection and information gain, while the wrapper methods are sequential forward selection and sequential backward elimination. The simulation was carried out using R, an open-source programming language. Simulation results showed that the wrapper methods (sequential forward selection and sequential backward elimination) were better than the filter methods at selecting the correct features.
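The filter/wrapper distinction can be sketched compactly. The study used R with a logistic regression classifier; the sketch below is in Python with a crude nearest-centroid stand-in for the classifier, purely to contrast the two families: a filter ranks features individually by a statistic, while a wrapper scores whole feature subsets with the classifier itself.

```python
import numpy as np

# Toy contrast of filter vs wrapper selection on synthetic data where
# only feature 0 is informative. Nearest-centroid accuracy stands in
# for the study's logistic regression classifier.

def filter_rank(x, y):
    """Filter: rank features by absolute correlation with the target."""
    return np.argsort([-abs(np.corrcoef(x[:, j], y)[0, 1])
                       for j in range(x.shape[1])])

def accuracy(x, y, cols):
    sub = x[:, cols]
    c0, c1 = sub[y == 0].mean(0), sub[y == 1].mean(0)
    pred = (np.linalg.norm(sub - c1, axis=1)
            < np.linalg.norm(sub - c0, axis=1)).astype(int)
    return (pred == y).mean()

def forward_select(x, y, k):
    """Wrapper: greedily add the feature that most improves accuracy."""
    chosen = []
    for _ in range(k):
        rest = [j for j in range(x.shape[1]) if j not in chosen]
        chosen.append(max(rest, key=lambda j: accuracy(x, y, chosen + [j])))
    return chosen

rng = np.random.default_rng(1)
y = rng.integers(0, 2, 200)
x = rng.standard_normal((200, 6))
x[:, 0] += 2 * y                    # only feature 0 carries the signal
```

Both approaches should recover feature 0 here; the practical difference, reflected in the study's results, is that the wrapper evaluates interactions between features at the price of many more classifier fits.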
This paper attempts to recognise and classify firearms using a two-layer neural network model based on a feedforward backpropagation algorithm. Numerical features from the combined images were utilised for enhanced firearm classification performance. Inputs to the network model were 747 images extracted from the firing pin impressions of five different firearms of the same model, the Parabellum Vector SPI 9mm. Features generated from the dataset were further grouped into a training set (523 features), a testing set (112 features) and a validation set (112 features). Under supervised learning, experimental results showed that a two-layer BPNN with an 11-11-5 configuration, tansig/purelin transfer functions and the "trainlm" training algorithm efficiently produced 87% correct classification. The classification result is an improvement compared with previous works. Finally, the combined image regions can offer some helpful information for the classification of firearms.
Neurocomputing has been applied successfully in time series forecasting tasks because of its ability to learn any pattern automatically, without prior assumptions or loss of generality; yet the outliers that frequently occur in time series data may contaminate the network training data. In principle, the most common algorithm for training the network is the backpropagation (BP) algorithm, which relies on minimisation of the ordinary least squares (OLS) estimator, specifically the mean squared error (MSE). However, this algorithm is not entirely robust when outliers are present, and it may lead to false prediction of future values. In this paper, we present a new algorithm that applies the firefly algorithm to the least median of squares (FFA-LMedS) estimator for optimisation of the neural network nonlinear autoregressive moving average (ANN-NARMA) model, to alleviate the outlier problem in time series data. Moreover, the performance of the consolidated model is compared, using root mean squared error (RMSE) values, with other robust ANN-NARMA models based on M-estimators, Iterative LMedS and Particle Swarm Optimisation on LMedS (PSO-LMedS). The actual monthly price data for Malaysian Aggregate, Sand and Roof Materials were taken from January 1980 to December 2012 (base year 1980=100), with various levels of outlier problems. It was found that the robustified ANN-NARMA model using FFA-LMedS delivered the best results, with RMSE values showing almost no error in the training, testing and validation sets for every variable. The findings are hoped to assist the relevant authorities, including those in PFI construction projects, to overcome cost overruns.
ANN, time series, robust backpropagation, firefly algorithm, least median squares
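The LMedS criterion at the heart of FFA-LMedS is easy to state: instead of minimising the mean of squared residuals (the MSE that standard backpropagation uses), minimise their median, which a minority of gross outliers cannot move. The firefly search itself is omitted here; this sketch only contrasts the two objectives on residuals containing one gross outlier.

```python
import numpy as np

# Contrast of the OLS/MSE objective with the least-median-of-squares
# (LMedS) objective on residuals with a single gross outlier.

def mse(residuals):
    """Ordinary objective: mean of squared residuals."""
    return np.mean(np.square(residuals))

def lmeds(residuals):
    """Robust objective: median of squared residuals."""
    return np.median(np.square(residuals))

clean = np.array([0.1, -0.2, 0.15, -0.05, 0.1])
dirty = np.append(clean, 50.0)      # one gross outlier
```

The outlier inflates the MSE by orders of magnitude while leaving the LMedS value essentially unchanged, which is why network weights tuned against LMedS (by firefly, PSO or iterative search, since the median is not differentiable) track the clean pattern rather than the contamination.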
PM10 has been identified as a common problem in Malaysia and many other countries all over the world. A Markov chain probability model was found to fit the average daily PM10 concentration data of an urban station (Shah Alam) and a background station (Jerantut) in Malaysia. This study aims to identify the occurrence of polluted and non-polluted days affected by PM10 concentrations based on data for a 12-year period (2002-2013). The first-order transition probability matrix of a Markov chain model and a two-state Markov chain, with states polluted day (1) and non-polluted day (0), were used for this purpose. The threshold value used in this study refers to the WHO 2006 guideline (50 µg/m3). Results of the analysis show that there is a high probability that the next day's event depends on what has happened on the previous day. The recurrence time of polluted days for Shah Alam is 4-5 days, while for Jerantut it is 2-3 days. By fitting the first-order Markov chain model, the results show that a higher-order Markov chain model is needed in order to obtain the best-fitting distribution of polluted events at these two monitoring stations. Thus, the prediction of PM10 pollution events can be made by considering the conditions of the previous day.
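The two-state first-order chain described above can be sketched directly: estimate the transition matrix by counting day-to-day transitions in the exceedance sequence, then read the mean recurrence time of polluted days from the stationary probability of state 1. The daily sequence below is a toy example, not the station data.

```python
import numpy as np

# Two-state Markov chain (0 = non-polluted day, 1 = polluted day):
# estimate the first-order transition matrix from a daily sequence,
# then the mean recurrence time of polluted days as 1 / pi_1.

def transition_matrix(states):
    p = np.zeros((2, 2))
    for a, b in zip(states[:-1], states[1:]):
        p[a, b] += 1                  # count observed transitions
    return p / p.sum(axis=1, keepdims=True)

def mean_recurrence(p):
    pi1 = p[0, 1] / (p[0, 1] + p[1, 0])   # stationary prob. of state 1
    return 1.0 / pi1                      # mean days between polluted days

seq = [0, 0, 1, 1, 0, 0, 0, 1, 0, 0, 1, 1, 1, 0, 0]  # toy daily record
p = transition_matrix(seq)
```

A recurrence time of 4-5 days at Shah Alam, as reported above, corresponds to a stationary polluted-day probability of roughly 0.2-0.25 under this reading.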
Takaful, the Islamic alternative to conventional insurance, is based on the concept of social solidarity, cooperation and mutual indemnification of losses among members. The transparency offered in the Takaful system eliminates the elements of gharar (uncertainty), maisir (gambling) and riba (usury). Due to the dynamic and complex cash flows in the Takaful system, a system dynamics approach is applied in order to discover possible internal and external impacts on the assumptions used in determining participants' contribution rates. The traditional deterministic approach has limitations, in that changes in actual experience may cause operators to stop issuing the contract or product. Using system dynamics, the possible effects of actual experience can be determined in terms of the amounts transferred to the shareholders' fund, and the results can assist management in deciding which assumptions to use so that operators remain solvent while making a profit. The results of the system dynamics simulation analysis in this paper represent the impacts of component changes in the Takaful model. They can be used as decision tools for Takaful operators to determine the best assumptions and strategies to maximise their profits.
Actuarial Science, System Dynamic, Insurance, Takaful
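A stock-and-flow view of the Takaful cash flows can be illustrated with a toy simulation in Python: contributions flow into a participants' risk fund, claims and a wakalah fee flow out, and any surplus is shared with the operator. All rates below are hypothetical placeholders, not the assumptions analysed in the paper:

```python
def simulate_takaful(months, participants, contribution, claim_rate,
                     wakalah_fee, surplus_share):
    # Minimal stock-and-flow sketch of a Takaful model: two stocks
    # (risk fund, shareholders' fund) updated by monthly flows.
    risk_fund = 0.0
    shareholder_fund = 0.0
    for _ in range(months):
        gross = participants * contribution   # inflow: contributions
        fee = wakalah_fee * gross             # outflow: operator's fee
        claims = claim_rate * gross           # outflow: expected claims
        risk_fund += gross - fee - claims
        shareholder_fund += fee
        if risk_fund > 0:                     # share surplus with operator
            transfer = surplus_share * risk_fund
            risk_fund -= transfer
            shareholder_fund += transfer
    return risk_fund, shareholder_fund

# Hypothetical assumptions: 1000 participants, RM100 monthly
# contribution, 60% claim rate, 30% wakalah fee, 50% surplus share.
rf, sf = simulate_takaful(months=12, participants=1000, contribution=100.0,
                          claim_rate=0.6, wakalah_fee=0.3, surplus_share=0.5)
```

Varying one assumption at a time (e.g. the claim rate) and re-running the loop shows how each component change propagates into the shareholders' fund, which is the kind of what-if analysis the system dynamics model supports.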
The purpose of this study was to demonstrate a facile method to produce monodispersed and size-controllable potassium silicate nanoparticles from rice straw waste. Unlike other methods that use expensive raw chemicals, our method utilises rice straw waste, a cost-free and widely available source of silica. In the experimental procedure, rice straw waste was burned. The burned rice straw was then put through an alkaline extraction process (using potassium solution) and a flame-assisted spray-pyrolysis apparatus. To support the flame-assisted spray pyrolysis, we utilised commercially available liquid petroleum gas as the fuel for the flame combustion process. Experimental results showed that the prepared particles are monodispersed, spherical, and have sizes in the nanometre range (from 20 to 80 nm). The fuel flow rate plays an important role in controlling particle size in the final product: increases in fuel flow lead to the formation of larger particles. Since the present method can convert rice straw into useful and valuable potassium silicate particles, further development of this study would have a positive impact on the reduction of rice straw waste emission.
Lagoons are shallow coastal bodies of water separated from the ocean by a series of barrier islands which lie parallel to the shoreline. Sediment supplied to enclosed lagoons influences the distribution and morphodynamics of coastal lagoons. Lagoons are re-shaped by erosion and deposition, by the accumulation of material washed or blown over the enclosing barriers, and by the accretion of in-flowing river sediment. The purpose of this study is to describe the methodology and results of identifying lagoon morphodynamics from multitemporal Landsat image data of the Segara Anakan Lagoon for the years 1978, 1994, 2003 and 2013. The method used is visual interpretation of Landsat images using GIS and remote sensing techniques to produce maps of lagoon morphodynamics and multitemporal coastline change. The overall results of the identification of coastal lagoon morphodynamics are a decrease in water-body area, an increase in land accretion area, and coastline changes. The results are used to examine ecological management of coastal lagoons.
Coastal Lagoon, Landsat Images, Morphodynamic, Multitemporal, Segara Anakan
This study simulates a multi-hop Wireless Sensor Network (WSN) with different topologies and analyses its performance in terms of the number of messages exchanged and energy usage. Sensor nodes in the simulation were modelled after an Arduino hardware system equipped with a compatible radio transceiver for communication. The sensor nodes were configured in two network topologies, grid and random, for performance comparison. Network sizes varied between 9 and 256 nodes. The simulation was stopped when the communication link between the sensor nodes and their sink node broke down. The results show that the grid topology performs better, especially for small network sizes; however, as the number of nodes in the network increases, the random topology outperforms the grid. Nonetheless, the lifetime of the sensor network does not depend on network size or topology, but rather on the energy available in each sensor node. We have also improved the energy consumption model to account for more parameters of the radio transceiver used in a WSN node. The energy needed to turn the radio transceiver on and off plays a significant part in the energy consumption of the sensor node.
Energy consumption, grid network topology, multi-hop routing, random network topology, wireless sensor network
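The kind of energy model extension described above, adding transceiver switching cost to the usual per-bit transmit and receive terms, can be sketched as follows. The coefficients are illustrative defaults, not the paper's measured Arduino/transceiver values:

```python
def node_energy(n_tx, n_rx, n_switch, bits_per_msg,
                e_tx_bit=50e-9, e_rx_bit=50e-9, e_switch=1e-6):
    # First-order radio energy model extended with the cost of turning
    # the transceiver on and off (all coefficients are hypothetical).
    e_tx = n_tx * bits_per_msg * e_tx_bit   # energy spent transmitting
    e_rx = n_rx * bits_per_msg * e_rx_bit   # energy spent receiving
    e_sw = n_switch * e_switch              # on/off transition overhead
    return e_tx + e_rx + e_sw

# A relay node in a multi-hop topology forwards every message it
# receives, and wakes/sleeps the radio around each exchange.
joules = node_energy(n_tx=1000, n_rx=1000, n_switch=2000, bits_per_msg=256)
```

With a model of this shape, a node's lifetime follows directly as its battery energy divided by the per-round consumption, which is why lifetime depends on the available energy per node rather than on topology.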
Nowadays, large datasets have become a main focus of researchers in many areas. However, a challenge that remains largely unresolved is the lack of strategies for analysing large time-series datasets in parallel. This research therefore aims to design a model of exponential smoothing working on parallel computing by using the bootstrap method. The model consists of three parts: data pre-processing using the bootstrap method, parallel exponential smoothing, and aggregation of the results into the final predicted values. To implement these processes, packages available in the R environment, such as "foreach", "forecast" and "doParallel", are utilised; the R environment provides many packages for scientific computing, data analysis, time-series analysis and high-performance computing. For testing and validating the proposed model and implementation, a case study in astronomy, namely the prediction of an asteroid's orbital elements, was conducted. Moreover, a comparison with the results produced by the Regularized Mixed Variable Symplectic 4 Yarkovsky Effect (RMVS4-YE) algorithm is also presented in this paper to provide a high level of confidence in the proposed model.
Exponential smoothing, orbital element, parallel computing, R programming language, time series analysis
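The pipeline above is implemented in R with "foreach", "forecast" and "doParallel"; the same bootstrap-smooth-aggregate structure can be sketched in Python, with simple exponential smoothing and a moving-block bootstrap used here as stand-ins for the paper's exact choices:

```python
import random
from concurrent.futures import ThreadPoolExecutor

def ses_forecast(series, alpha=0.3):
    # Simple exponential smoothing; returns the one-step-ahead forecast.
    level = series[0]
    for y in series[1:]:
        level = alpha * y + (1 - alpha) * level
    return level

def block_bootstrap(series, block=5, seed=0):
    # Moving-block bootstrap: resampling contiguous blocks preserves
    # short-range dependence in the time series.
    rng = random.Random(seed)
    out = []
    while len(out) < len(series):
        start = rng.randrange(len(series) - block + 1)
        out.extend(series[start:start + block])
    return out[:len(series)]

def parallel_forecast(series, replicates=8):
    # Step 1: pre-process by generating bootstrap replicates.
    samples = [block_bootstrap(series, seed=s) for s in range(replicates)]
    # Step 2: smooth every replicate in parallel (thread pool used
    # here for simplicity; R's doParallel uses worker processes).
    with ThreadPoolExecutor() as pool:
        forecasts = list(pool.map(ses_forecast, samples))
    # Step 3: aggregate the individual forecasts by averaging.
    return sum(forecasts) / len(forecasts)

forecast = parallel_forecast([10.0] * 20)  # hypothetical constant series
```

Each replicate is independent, so the smoothing step parallelises with no coordination beyond the final averaging, which is what makes the approach attractive for large series.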
To improve reception in a Long Term Evolution (LTE) network inside a building, a repeater is needed. The antenna used in an indoor repeater is usually a high-gain antenna with an omnidirectional radiation pattern. Meanwhile, one method of increasing the data rate in LTE is to use a Multiple Input Multiple Output (MIMO) antenna. In this paper, an omnidirectional MIMO antenna at 1.8 GHz for LTE applications was designed and realised. The single element of this MIMO antenna is a collinear microstrip antenna array. The design and simulation were done using 3D electromagnetic simulator software, while the antenna was realised on FR4 microstrip with a thickness of 1.6 mm and permittivity of 4.4. Measurement results showed that the antenna has a 359 MHz bandwidth in the 1.6-1.9 GHz frequency range, with a return loss of less than -10 dB. The antenna gain is around 7.4 to 8.7 dBi with an omnidirectional radiation pattern, and the mutual coupling is around -22 dB to -27 dB.