The proposed project aims to develop a systematic approach for improving deep-learning-based computer vision systems by augmenting local pixel data with global shape data (more specifically, Jordan curves) and by adjusting system architectures to accommodate the augmented input. Three canonical computer vision problems will be investigated: image dehazing, alpha matting, and face detection. The potential roles of Jordan curves in these applications will be examined.
The project aims to develop a novel deep-learning-based computer vision system that identifies different categories and sub-categories of furniture along with their associated attributes (such as color, shape, style, and material). It will also develop an automated recommendation system that learns from massive historical data and the ongoing data stream to adaptively adjust the parameter combination for each product, maximizing the chance of winning against competing companies.
As in any industry, companies constantly strive to improve their products. This can be done directly at the level of raw materials or production lines, but profitability can also be improved by reducing waste while increasing production rates. This project will therefore investigate processing conditions and adhesive formulations in terms of mechanical, thermal, chemical, and adhesion properties, by optimizing both the processing conditions and the concentrations of the components.
The measurement of toxins, such as arsenic, mercury, cadmium, lead, and chromium, in food, beverages, waters, and other environmental samples must be carried out to verify that they pose no danger. This requires analyzing numerous samples each day using instruments that can measure the trace amounts that may be present.
The project concerns big data analysis and management for the food testing industry using machine learning techniques. Before and during its distribution to the market, any food product must undergo extensive safety and quality testing at food testing laboratories. During this process, various microbiology and analytical chemistry tests are performed to ensure the safety and quality of food products before they are offered on the market.
Across industries, many engineering documents and drawings have accumulated over the past few decades. However, they are mostly archived on paper or in rudimentary electronic form (typically as images or PDFs), making information retrieval highly inconvenient. As a result, a great deal of valuable engineering data has been left unutilized or is, at the very least, difficult to access. Unfortunately, existing open-source tools do not offer a simple remedy.
The explosion in popularity of deep learning owes much to the success of convolutional neural networks, which are widely used in diverse fields including computer vision and natural language processing. Recently, the group equivariant convolutional neural network (G-CNN) was introduced, in which equivariance to symmetries inherent in the data set is built into the architecture of the network.
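The equivariance property underlying G-CNNs can be stated as f(Tx) = T f(x): transforming the input and then applying the operation gives the same result as applying the operation and then transforming the output. As a minimal numerical sketch (not the G-CNN construction itself, just the property it builds in), the following hypothetical example checks that cross-correlation with a 90°-rotation-symmetric kernel is equivariant to 90° rotations of the image:

```python
import numpy as np

def conv2d(img, kernel):
    """Plain 'valid' 2D cross-correlation (no padding, stride 1)."""
    h, w = img.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(img[i:i + kh, j:j + kw] * kernel)
    return out

rng = np.random.default_rng(0)
img = rng.random((8, 8))

# A kernel invariant under 90-degree rotation (rot90(k) == k).
k = np.array([[1, 2, 1],
              [2, 4, 2],
              [1, 2, 1]]) / 16.0

# Equivariance check: rotate-then-filter vs. filter-then-rotate.
lhs = conv2d(np.rot90(img), k)
rhs = np.rot90(conv2d(img, k))
print(np.allclose(lhs, rhs))  # True for this symmetric kernel
```

A generic (asymmetric) kernel would fail this check; a G-CNN instead achieves equivariance for arbitrary learned filters by convolving over the symmetry group rather than relying on kernel symmetry.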
Data centers are now growing and expanding massively; they are large-scale and heterogeneous. In addition, they rely increasingly on emerging technologies such as Software Defined Networking (SDN) and Network Functions Virtualization (NFV), with network softwarization as their key feature. Moreover, they are now being augmented with edge data centers rooted in concepts such as cloudlets, ETSI Mobile Edge Computing (MEC), and fog computing. Such data centers bring a host of new challenges for the automation of configuration management and deployment.
The rapid development of technology and its ubiquitous integration into modern society have made clear to business and technology leaders the need for ethical conversations about the role technology can and should play in our lives. By interviewing primary stakeholders in the technology sector and putting their ideas in conversation with the literature on the role of ethics in tech, this project will foster a critical dialogue between businesses, governments, and users to overcome the ethical challenges posed by technological innovation.
Machine learning can be used to predict employee events such as retention, promotion, or movement. This project explores how to generate better predictions by identifying correlations and exploiting them through features that increase predictive strength. Furthermore, the project explores how to reliably fine-tune the predictive model to a particular data set in the presence of interdependence among data points. The results will enable improved machine learning predictions related to employee events.
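One standard way to handle interdependent data points when validating or tuning a model is group-aware cross-validation: records that belong to the same entity (e.g., multiple yearly records per employee) are kept entirely within one fold, so the model is never evaluated on an entity it was trained on. The snippet below is a minimal sketch of that idea; the `group_kfold` helper and the example group labels are illustrative, not part of the project described above:

```python
import numpy as np

def group_kfold(groups, n_splits=3):
    """Yield (train_idx, test_idx) pairs such that no group
    appears in both the train and test sides of any split."""
    groups = np.asarray(groups)
    unique = np.unique(groups)
    for fold_groups in np.array_split(unique, n_splits):
        test_mask = np.isin(groups, fold_groups)
        yield np.where(~test_mask)[0], np.where(test_mask)[0]

# Hypothetical data: two records per employee; the employee id is the group.
groups = [0, 0, 1, 1, 2, 2, 3, 3, 4, 4, 5, 5]
for train_idx, test_idx in group_kfold(groups, n_splits=3):
    train_groups = set(np.asarray(groups)[train_idx])
    test_groups = set(np.asarray(groups)[test_idx])
    # No employee leaks from the training side into the evaluation side.
    assert train_groups.isdisjoint(test_groups)
```

Compared with a naive random split, this prevents optimistic performance estimates caused by near-duplicate records of the same employee landing on both sides of the split.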