Chen, Liang
Person Preferred Name
Liang Chen
Related Works
Content type
Digital Document
Origin Information
Content type
Digital Document
Description / Synopsis
Aquatic vegetation occurs frequently in rivers and floodplains and significantly affects the flow structure. In cold regions, a further common feature is the presence of river ice on the water surface. An ice cover imposes an additional boundary layer at the water surface, which leads to significant changes in flow structure and bed deformation; it also causes the velocity profile to decrease near the cover. Because of vegetation's positive impacts on water quality, habitat, and channel stability, researchers now advocate replanting and restoration projects in rivers, especially in agricultural waterways, floodways, and emergency spillways. However, the expansion of vegetation in fluvial systems may worsen flood impacts, since dense in-channel vegetation reduces flow capacity by increasing flow resistance and decreasing the effective channel width. An accurate and critical assessment of vegetation density and distribution pattern, through their reduction of the bulk velocity, is therefore crucial for sustainable restoration projects. To the author's knowledge, no studies have investigated the combined impacts of ice cover and vegetation on flow resistance and channel bed deformation. It is thus necessary to examine the interaction between vegetation and ice cover thoroughly in order to ensure successful restoration projects. Most research on submerged vegetation has been conducted in small-scale laboratory flumes, and specifically under open-channel flow conditions. In addition, most reported research uses uniform sediment, which is not representative of natural river systems. In the present study, deflected and non-bending model vegetation elements were arranged in both square and staggered configurations at different densities on channel beds composed of three different non-uniform sands, and experiments were run under two water-surface conditions: open-channel flow and ice-covered flow.
To simulate the ice-covered condition, smooth and rough model ice covers made of Styrofoam panels were created to investigate the impact of cover roughness on channel bed deformation. To represent non-uniform sediment, three bed materials with median particle sizes (D50) of 0.50 mm, 0.60 mm, and 0.98 mm were used. Results showed that the most significant variable influencing the depth of scour holes under ice-covered flow is the ratio of ice-cover roughness to bed roughness, whereas under open-channel flow the flow Froude number is the determining factor. In the experiments, the maximum scour depths consistently occurred at the upstream front face of the vegetation elements, and the scour holes were deeper and longer under ice-covered flow. With vegetation in the bed under ice-covered flow conditions, the velocity profiles exhibit a distinct pattern characterized by two peak values. The study revealed an inverse relationship between canopy density and the dimensions of the wake zone. As the spacing between deflected vegetation elements decreases, the streamwise velocity is significantly retarded slightly below the inflection point; with a sparser canopy, the inflectional region diminishes or disappears. No inflection point was observed for non-bending vegetation, and velocity profiles showed more pronounced inflection points for a staggered arrangement of vegetation elements than for a square arrangement. The results of this study provide vital information for river management, channel restoration, and the rehabilitation of fluvial environments, including cold-region river ecosystems, through an improved understanding of the effects of vegetation density, arrangement pattern, and morphology.
Origin Information
Content type
Digital Document
Description / Synopsis
One of the driving forces that pushes a normal cell toward a cancerous state is unregulated control of cell division, through the suppression of tumour suppressor genes and the overexpression of oncogenes. These cancer-driver genes have been associated with the progression of different cancer types; understanding how they are represented in a cancer's gene expression profiles, and which genes they are associated with, is a step toward understanding that cancer's development. A novel way of modelling these gene expression profiles is graphlet-based network analysis, a data mining technique that allows identification, understanding, and prediction of their functionality, emergent properties, and potential controllability. This thesis aims to identify patterns in gene co-expression networks of cancer census genes shared between breast cancer and lung cancer, using cancer gene census data provided by COSMIC, the Catalogue of Somatic Mutations in Cancer.
Origin Information
Content type
Digital Document
Description / Synopsis
Financial markets require a great deal of decision making from investors and market makers. One metric that can ease this decision making is investment risk, which can be decomposed into two parts: systematic risk and idiosyncratic risk. A clear understanding of the volatility in each risk component can be a powerful signal for selecting assets that maximize investment returns. In this project, we focus on idiosyncratic volatility and provide an easy-to-use source code implementation that can pave the road for future studies on the relation between idiosyncratic volatility and asset returns. Using our implementation, we pre-calculated idiosyncratic volatility values for 31,198 members of the NYSE, Amex, and Nasdaq markets for trade dates between January 1963 and December 2019, and we release this dataset along with our source code. Additionally, we consider the application of machine learning techniques to predicting idiosyncratic volatility from raw trade data, as a way to extend the dataset to future market trade records that have not yet occurred. We offer a deep learning based regression model and compare it with traditional tree-based methods on a small subset of our pre-calculated idiosyncratic volatility dataset. Our analytical results show that the deep learning techniques are considerably more robust than the traditional tree-based baselines. However, more work is needed before a machine learning model can reliably predict idiosyncratic volatility values from raw trade records.
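The synopsis does not state which factor model the project uses; a common definition of idiosyncratic volatility is the standard deviation of the residuals from a factor regression of asset returns. A minimal sketch under that assumption, using a single-factor (market) model and hypothetical synthetic returns:

```python
import numpy as np

def idiosyncratic_volatility(asset_returns, market_returns):
    """Estimate idiosyncratic volatility as the standard deviation of the
    residuals of an OLS regression of asset returns on market returns
    (a single-factor market model)."""
    X = np.column_stack([np.ones_like(market_returns), market_returns])
    coeffs, *_ = np.linalg.lstsq(X, asset_returns, rcond=None)
    residuals = asset_returns - X @ coeffs
    return residuals.std(ddof=2)  # ddof=2: two estimated parameters

# Hypothetical daily returns: true beta = 1.2, idiosyncratic std = 0.02
rng = np.random.default_rng(0)
market = rng.normal(0.0005, 0.01, 252)
noise = rng.normal(0.0, 0.02, 252)
asset = 0.0001 + 1.2 * market + noise
print(idiosyncratic_volatility(asset, market))  # close to the injected 0.02
```

In practice the same regression would be run per stock over a rolling window of the raw trade data, with a richer factor model if desired.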
Origin Information
Content type
Digital Document
Description / Synopsis
Managing the waste generated after response operations is one of the most challenging parts of an offshore oil spill. A waste estimate is required before deciding on the transportation, treatment, and disposal of each type of oil spill waste. This thesis therefore first developed a system dynamics model to estimate the quantity of each type of oily waste generated after oil spill response operations, considering factors such as weather conditions, spilled oil volume and characteristics, response time, and equipment. Applied to an actual 2016 oil spill in BC, Canada, as a case study, the model showed an 86% average accuracy. Sensitivity analysis of the case study illustrated that a five-hour decrease in response arrival time could increase oil recovery by 26%, and revealed a possible 45% overuse of sorbents. Response surface methodology (RSM) was also applied, demonstrating significant interaction effects between sea temperature and response arrival time on recovered oil, and between sorbent boom weight and sorbent boom usage rate on solid waste. In addition to the oil spill response waste (OSRW) quantity estimation model, the study's second objective was a scenario-based decision-making framework that identifies the most monetarily beneficial strategy for each collected OSRW type under different scenarios of impact factors (e.g., waste quantity, waste quality, location, and the capacity and availability of treatment and disposal facilities). An optimization model minimizing net costs was developed to evaluate all scenarios using hypothetical and actual data, and the results were categorized to build the decision-making framework. Oil processing was shown to be the best option for managing liquid oily waste from spilled refined oil.
For liquid oily waste from spilled crude oil, an oil refinery is the best option if the quantity is above a limit defined in this study. For solid oily waste management, pyrolysis is the most appropriate destination. The optimal solutions and sensitivity analysis for the actual case study data validated the results.
Origin Information
Content type
Digital Document
Description / Synopsis
Numerous studies have provided insight into the challenges investors confront when making investments, such as allocating resources across a variety of stocks and securities. In response to these challenges, various portfolio theories have been developed; among them, Modern Portfolio Theory (MPT), developed by Harry Markowitz, is one of the most famous. The purpose of this project is to investigate whether MPT can be optimized to achieve a higher return while constructing fewer candidate portfolios. We propose a two-step approach and compare the results with the existing theory. Rather than searching for the optimal portfolio in one step, our two-step approach breaks the optimization process into two steps, each involving a group of randomized portfolios. We find an initial optimal portfolio from the first group; in the second step, the final optimal portfolio is determined from a second group of randomized portfolios generated around the initial optimum from the first step. Our simulations indicate that the two-step approach is more efficient and gives a higher rate of return compared to the existing approach.
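The synopsis does not specify how the randomized portfolios are sampled or scored; a minimal sketch of the two-step idea, assuming long-only weights drawn from a Dirichlet distribution in step one, local perturbations around the step-one winner in step two, and a return-to-volatility ratio as the objective (all of these are illustrative choices, not the project's exact method):

```python
import numpy as np

def best_random_portfolio(mean_ret, cov, n_samples, rng, center=None, spread=0.1):
    """Sample random long-only weight vectors and return the one with the
    highest return-to-volatility ratio. If `center` is given, sample locally
    around it (step 2); otherwise sample globally (step 1)."""
    def score(w):
        return (w @ mean_ret) / np.sqrt(w @ cov @ w)

    best_w = center if center is not None else rng.dirichlet(np.ones(len(mean_ret)))
    best_score = score(best_w)
    for _ in range(n_samples):
        if center is None:
            w = rng.dirichlet(np.ones(len(mean_ret)))            # global sample
        else:
            w = np.clip(center + rng.normal(0, spread, len(mean_ret)), 0, None)
            if w.sum() == 0:
                continue
            w /= w.sum()                                         # local sample
        s = score(w)
        if s > best_score:
            best_w, best_score = w, s
    return best_w, best_score

# Hypothetical three-asset universe
rng = np.random.default_rng(1)
mu = np.array([0.08, 0.12, 0.10])
cov = np.diag([0.04, 0.09, 0.05])
w1, s1 = best_random_portfolio(mu, cov, 500, rng)             # step 1: global
w2, s2 = best_random_portfolio(mu, cov, 500, rng, center=w1)  # step 2: refine
```

Because step two starts from the step-one winner, the refined score can never be worse; the efficiency claim is that two smaller samples focused this way beat one large global sample.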
Origin Information
Content type
Digital Document
Description / Synopsis
Increasingly, artificial neural networks are being explored to learn relationships in temporal sequence data for classification, prediction, and anomaly detection, with the hope of exceeding the performance of more traditional machine learning algorithms. While Long Short-Term Memory and Gated Recurrent Unit networks are still the preferred choices of many researchers, such recurrent networks are suboptimal for learning relationships within and across longer sequences. Transformer neural networks, originally designed to improve performance on natural language processing tasks, pose an interesting alternative, as their attention mechanisms are better able to capture context and meaning within longer sequences. These features present an opportunity to apply transformer networks to temporal sequences of financial asset prices. This thesis introduces an extension of the original transformer neural network that is capable of multivariate time series representation learning in a supervised learning context and trains it on temporal sequences of financial asset prices. The prediction accuracy of the transformer extension exceeds that of two of the most popular recurrent neural networks used for temporal sequence prediction. The experiments are conducted in the context of a trading algorithm that showcases the practical potential and its implications. As the model is not specific to its input data, the enhancements may transfer to other domains.
Origin Information
Content type
Digital Document
Description / Synopsis
In recent years, computer-animated characters have been designed to be increasingly vivid and lifelike, and many of them are extremely similar to real people. Because of this high degree of similarity, some classical face recognition models become confused during face identification; for example, the FaceNet model matches cartoon facial images with similar real faces. As a result, some people may try to cheat face recognition systems by presenting virtual faces. To address this problem, this paper proposes an integrated approach that uses Multi-task Cascaded Convolutional Networks (MTCNN) and a ResNet-50 model to classify an input image as a real or cartoon (virtual) face before the face identification task. Our experiments show that the proposed integrated approach achieves better results on face identification tasks than classical face recognition models that perform the task directly.
Origin Information
Content type
Digital Document
Description / Synopsis
Stock forecasting is a very complicated task due to the noisy and volatile characteristics of stock prices, and how to effectively eliminate the noise has attracted attention from both investors and researchers. This report presents a novel de-noising technique named the Line Segment Algorithm (LSA). Unlike signal-processing methods, LSA is based on the characteristics of financial time series. First, the algorithm identifies shape patterns in the historical stock price series and labels them as turning points or false alarms. Then, a stock trend prediction framework is built and trained on the shape patterns extracted by the algorithm, so that the model can predict whether a new shape pattern is a turning point. To evaluate its performance, experiments on real stock data were carried out with LSTM and Random Forest models, respectively. The results show that LSA is effective, yielding better prediction accuracy. It provides a new perspective for stock trend analysis and can also be applied in actual stock investment and trading.
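The synopsis does not describe how LSA distinguishes turning points from false alarms; purely as an illustration of the labeling idea, a hypothetical sketch that flags local price extrema and labels them by the relative size of the subsequent move (the threshold and one-step lookahead are invented for this example, not taken from the report):

```python
import numpy as np

def candidate_turning_points(prices, threshold=0.03):
    """Label each local extremum of a price series: a 'turning point' if the
    following move exceeds `threshold` (relative change), else a 'false alarm'."""
    labels = []
    for i in range(1, len(prices) - 1):
        is_peak = prices[i] > prices[i - 1] and prices[i] > prices[i + 1]
        is_trough = prices[i] < prices[i - 1] and prices[i] < prices[i + 1]
        if not (is_peak or is_trough):
            continue
        # size of the move after the extremum, relative to its level
        move = abs(prices[i + 1] - prices[i]) / prices[i]
        labels.append((i, "turning point" if move >= threshold else "false alarm"))
    return labels

prices = np.array([100, 103, 101, 108, 107.8, 112, 105])
print(candidate_turning_points(prices))
```

Labels produced this way become the supervised targets for a downstream classifier such as the LSTM or Random Forest models used in the experiments.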
Origin Information
Content type
Digital Document
Description / Synopsis
Autonomous vehicles are the future of road transportation and can increase safety, efficiency, and productivity. In this thesis, we address a new edge case in autonomous driving: an autonomous vehicle that is approached by an emergency vehicle and needs to make the best decision. To achieve the desired behaviour and learn the sequential decision process, we trained our autonomous vehicle with Deep Reinforcement Learning algorithms and compared the results with rule-based algorithms. The driving environment for this study was developed using Simulation of Urban MObility (SUMO), an open-source traffic simulator. The proposed Deep Reinforcement Learning solution outperforms the rule-based baseline both in normal driving situations and when an emergency vehicle is approaching.
Origin Information
Content type
Digital Document
Description / Synopsis
Westerly wind bursts (WWBs), which usually occur in the tropical Pacific region, play a vital role in the El Niño–Southern Oscillation (ENSO). In this study, we use a hybrid coupled model (HCM) of the tropical Pacific ocean–atmosphere system to investigate the impact of WWBs on ENSO. To this end, two experiments are performed: (a) the standard version of the HCM is integrated for years without prescribed WWB events; and (b) WWBs are added to the HCM (HCM-WWBs). Results show that HCM-WWBs generates not only a more realistic climatology of sea surface temperature (SST), in both spatial structure and temporal amplitude, but also better ENSO features than the HCM. In particular, HCM-WWBs can capture central Pacific (CP) ENSO events, which are absent in the original HCM. Finally, the possible physical mechanisms responsible for these improvements by WWBs are discussed.
Origin Information
Content type
Digital Document
Description / Synopsis
The effective treatment of oily sludge has been a challenging problem faced by the petroleum industry worldwide. Oily sludge is a semi-solid mixture of hydrocarbons, water, metallic ions, and suspended fine solids, and its recalcitrant nature makes treatment a difficult and costly task. The objective of this dissertation research was to develop environmentally friendly and economically competitive techniques for oily sludge treatment. Three approaches were developed: ionic liquid (IL)-enhanced solvent extraction, co-pyrolysis with wood waste, and value-added utilization of oily sludge as a sorbent to remove lead (Pb2+) and cadmium (Cd2+) from solution. Firstly, compared to conventional solvent extraction, the IL-enhanced solvent extraction not only improved oil recovery efficiency but also greatly reduced solvent and energy consumption and shortened the treatment duration, even at low IL concentrations. Secondly, co-pyrolysis of oily waste and hog fuel was conducted in a fixed bed reactor. Three experimental parameters (pyrolysis temperature, reaction time, and hog fuel addition) were explored to optimize both oil recovery and metal ion immobilization. Immobilization was assessed through sequential extraction techniques, with high-temperature pyrolysis fixing metal ions within the residues. The addition of hog fuel had a significant synergistic effect on the distribution of metal ions among the extraction fractions, resulting in lower risk index (RI) values. Thirdly, the oily sludge-derived char (OS500) obtained at 500 °C could effectively remove Pb2+ from solution, with a maximum sorption capacity of 373.2 mg/g (based on a Langmuir model). Sorption of Pb2+ by OS500 was mainly attributed to its precipitation with carbonate (CO32‒) originating from OS500. The maximum sorption capacity for Cd2+, using a Langmuir model, was 23.19 mg/g; complexation and metal ion exchange dominated Cd2+ sorption on OS500.
The Pb2+ sorption capacity dramatically decreased as the pyrolysis temperature increased from 500 to 900 °C, due to the decomposition of the minerals that release CO32‒ at high temperature. The activated OS500 showed a higher sorption capacity (90.06 mg/g) for Cd2+ than OS500 (23.95 mg/g), because the conversion of barite (BaSO4) to witherite (BaCO3) after chemical activation favoured the precipitation of Cd-carbonate.
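The Langmuir maximum sorption capacities quoted above come from fitting the isotherm q = qmax·KL·Ce / (1 + KL·Ce) to equilibrium data. A minimal sketch of such a fit using the common linearized form Ce/qe = Ce/qmax + 1/(KL·qmax); the equilibrium data and KL value here are hypothetical, generated to match the reported qmax of 373.2 mg/g:

```python
import numpy as np

def fit_langmuir(ce, qe):
    """Fit the linearized Langmuir isotherm  Ce/qe = Ce/qmax + 1/(KL*qmax)
    by linear regression and return (qmax, KL)."""
    slope, intercept = np.polyfit(ce, ce / qe, 1)
    qmax = 1.0 / slope
    kl = slope / intercept
    return qmax, kl

# Hypothetical equilibrium data from qmax = 373.2 mg/g, KL = 0.05 L/mg
ce = np.array([10, 50, 100, 200, 400, 800], dtype=float)  # mg/L
qe = 373.2 * 0.05 * ce / (1 + 0.05 * ce)                  # mg/g
qmax, kl = fit_langmuir(ce, qe)
print(round(qmax, 1), round(kl, 3))  # recovers 373.2 and 0.05
```

With real measurements the points scatter around the line, and a nonlinear least-squares fit of the original isotherm is often preferred to the linearization.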
Origin Information
Content type
Digital Document
Description / Synopsis
Using a Construal Level Theory (CLT) foundation, the authors conduct four studies which find that consumers are more likely to pay attention to short-term (long-term) benefits if an event is taking place in the near (distant) future. Additionally, when people are deciding for themselves (for acquaintances), they are more likely to pay attention to short-term (long-term) benefits and proximal (distant) spatial locations. This research has theoretical and managerial implications: businesses can tailor marketing campaigns to emphasize short-term or long-term attribute dimensions, priming consumers to choose a certain alternative depending on how psychologically distant they are from an event or object. The research method was a set of questionnaires in which participants chose between two alternatives. The current research upholds the view from previous literature that a primary aim of consumer research is to understand the aspects influencing the trade-offs of a choice set in the preference construction process (Bettman, Luce, & Payne, 1998).
Origin Information
Content type
Digital Document
Description / Synopsis
Local scour around piers and abutments is one of the main causes of the collapse of bridges constructed in rivers. Many researchers have studied the prediction of the maximum depth of scour holes around bridge piers and abutments. However, most of this work has been done in small-scale laboratory flumes, and specifically under open-channel conditions. Moreover, most existing research on bridge piers uses uniform sediment, which is not representative of natural river systems; this can result in excessively conservative design values for scour in low-risk or non-critical hydrologic conditions. The most severe cases of bridge pier scour occur in cold regions, when the water surface freezes: the ice cover adds an additional boundary layer at the water surface, leading to significant changes in the flow field and the scour pattern around bridge piers. Ice cover also shifts the maximum flow velocity closer to the channel bed.
Origin Information
Content type
Digital Document
Description / Synopsis
Snow plays an important role in the hydrological cycle of watersheds in cold regions, and predicting the timing and magnitude of snow accumulation and ablation is necessary for water management in different sectors. A spatially distributed snow model (SnowModel), forced by meteorological data from automated weather stations, was chosen for this research and evaluated for two watersheds in southeastern BC. Two consecutive years (2006-2008) were selected for the calibration and validation processes. Simulated snow depth and snow water equivalent (SWE) were compared with observed data from snow pillows. Two goodness-of-fit measures, the Nash-Sutcliffe efficiency index and R-squared, were 0.96 and 0.98 for the accumulation period and 0.87 and 0.86 for the ablation period, respectively. The spatial distributions of snow depth and SWE over the model domains are also discussed. In general, SnowModel is able to estimate accumulated snow depth and SWE in alpine areas with a high level of accuracy.
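The Nash-Sutcliffe efficiency used in this evaluation compares the model's squared error against the variance of the observations. A minimal sketch with hypothetical observed and simulated SWE values:

```python
import numpy as np

def nash_sutcliffe(observed, simulated):
    """Nash-Sutcliffe efficiency: 1 minus the ratio of the model's squared
    error to the variance of the observations. 1 is a perfect fit; values
    below 0 mean the observed mean predicts better than the model."""
    observed = np.asarray(observed, dtype=float)
    simulated = np.asarray(simulated, dtype=float)
    return 1.0 - np.sum((observed - simulated) ** 2) / np.sum(
        (observed - observed.mean()) ** 2
    )

obs = np.array([10.0, 20.0, 30.0, 40.0])  # hypothetical observed SWE (cm)
sim = np.array([11.0, 19.0, 29.0, 42.0])  # hypothetical simulated SWE (cm)
print(nash_sutcliffe(obs, sim))  # 0.986
```

The reported values of 0.96 (accumulation) and 0.87 (ablation) would be computed exactly this way over the corresponding periods of the snow-pillow record.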
Origin Information