Optimization in a Multi-Currency Transaction System

PeerFX Inc. provides a peer-to-peer currency exchange platform that brings small business clients and individuals together and allows them to exchange currencies with each other. Rather than being charged the high spread a bank would take, users save significantly in this transaction system. However, the supply and demand of currencies normally cannot be balanced by the clients alone, so PeerFX must exchange the surplus of some currencies for the deficient ones with third-party currency exchanges.
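The imbalance the platform must lay off externally can be computed by netting client orders per currency. A minimal sketch (the order format and figures are hypothetical illustrations, not PeerFX's actual data model):

```python
from collections import defaultdict

def net_positions(orders):
    """Net client supply/demand per currency.
    Positive = surplus the platform must sell externally,
    negative = shortfall it must buy from a currency exchange."""
    net = defaultdict(float)
    for sell_ccy, buy_ccy, amount_sold, amount_bought in orders:
        net[sell_ccy] += amount_sold    # clients supplied this currency
        net[buy_ccy] -= amount_bought   # clients demanded this currency
    return dict(net)

# Hypothetical client orders: (currency sold, currency bought, qty sold, qty bought)
orders = [
    ("CAD", "USD", 1000.0, 750.0),
    ("USD", "CAD", 500.0, 660.0),
]
imbalance = net_positions(orders)   # {"CAD": 340.0, "USD": -250.0}
```

Here the CAD surplus of 340 would be exchanged externally to cover the 250 USD shortfall.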

Mathematics in Canada to 1980

The overall aim of the project is to produce a survey history of mathematics in Canada, to be used as a Historical Assessment by the Canada Museum of Science and Technology. This will concentrate primarily on the period following European colonization, in both French-speaking and English-speaking contexts. The research team proposes to cover the period up to roughly 1980. The work is designed expressly to facilitate the development of museum collections on mathematics and the mathematical sciences, and will aim to identify desiderata for such collections.

Quality Assessment of JPEG and JPEG2000 Compressed Medical Images

The goal of this project with Agfa Healthcare, a provider of diagnostic imaging and healthcare IT solutions, is to develop mathematical models for (i) the assessment and (ii) the improvement of compressed medical images. The rapidly increasing volume of data generated by new imaging modalities (e.g. CT scanners, MRI) necessitates the use of lossy compression techniques to decrease the cost of storage and improve the efficiency of transmission over networks. Increasing the degree of compression of an image, however, decreases its fidelity.
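A common first step in quantifying that fidelity loss is a full-reference metric such as peak signal-to-noise ratio (PSNR); the project's models are presumably more sophisticated, but a minimal sketch illustrates the compression/fidelity trade-off (the pixel values are made up):

```python
import math

def psnr(original, compressed, max_val=255.0):
    """Peak signal-to-noise ratio in dB between two equal-size images,
    given as flat sequences of pixel intensities. Higher = more faithful."""
    mse = sum((o - c) ** 2 for o, c in zip(original, compressed)) / len(original)
    if mse == 0:
        return float("inf")          # identical images
    return 10 * math.log10(max_val ** 2 / mse)

orig = [52, 55, 61, 59, 79, 61, 76, 61]   # toy 8-pixel "image"
comp = [50, 56, 60, 60, 78, 62, 75, 60]   # same image after lossy compression
quality = psnr(orig, comp)                # roughly 47 dB for this toy data
```

Stronger compression drives the mean squared error up and the PSNR down, which is the degradation the assessment models must capture.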

Information Extraction from Unstructured Data

Information extraction from unstructured data is a wide and relatively recent domain. This research project will focus on information extraction from financial reports and news, more precisely those related to the commodities market. This includes Natural Language Processing (NLP), expert systems (such as ontology-based systems) and information fusion as tools for analysing qualitative information in finance and producing investment decisions. NLP is the science of the automated understanding of natural human languages.
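As a toy illustration of the extraction task, a pattern-based extractor can pull (commodity, price, unit) facts out of a news sentence. The commodity list and patterns below are hypothetical and far simpler than an ontology-based system:

```python
import re

# Hypothetical miniature gazetteer; a real system would draw on an ontology.
COMMODITIES = ["crude oil", "gold", "copper", "wheat", "natural gas"]
PRICE_RE = re.compile(r"\$?(\d+(?:\.\d+)?)\s*(?:per|/)\s*(barrel|ounce|tonne|bushel)",
                      re.IGNORECASE)

def extract(sentence):
    """Return (commodity, price, unit) triples found in one sentence."""
    found = [c for c in COMMODITIES if c in sentence.lower()]
    prices = PRICE_RE.findall(sentence)
    return [(c, float(p), u.lower()) for c in found for p, u in prices]

facts = extract("Gold climbed to $1950.50 per ounce on Tuesday.")
# facts == [("gold", 1950.5, "ounce")]
```

Fusing such facts across many reports is where the information-fusion component comes in.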

Effects and Benefits of Discrete Cosine Transform on Polarimetric Decompositions: Application to Man-made Object Recognition

In this project, the intern will use a recently developed interpolation technique based on Lie group theory to enhance the quality of the classification of Synthetic Aperture Radar (SAR) images. The intern will evaluate the effects and discuss the benefits of this interpolation on the complete set of polarimetric features extracted from a fully polarimetric SAR image. The effects on the polarimetric features will in turn affect the performance of target recognition algorithms applied to the detection of man-made ground targets. The recognition itself will be carried out with a neural network.
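As a minimal stand-in for the recognition stage, a single-neuron logistic classifier can separate two classes of feature vectors. The synthetic two-dimensional "features" below are illustrative only, not actual polarimetric features or the project's network:

```python
import math
import random

def train_logistic(X, y, lr=0.5, epochs=100):
    """A 'one-neuron network': logistic regression fit by stochastic
    gradient descent on the cross-entropy loss."""
    w, b = [0.0] * len(X[0]), 0.0
    for _ in range(epochs):
        for xi, yi in zip(X, y):
            z = b + sum(wj * xj for wj, xj in zip(w, xi))
            p = 1.0 / (1.0 + math.exp(-z))      # sigmoid activation
            g = p - yi                          # gradient of loss w.r.t. z
            w = [wj - lr * g * xj for wj, xj in zip(w, xi)]
            b -= lr * g
    return w, b

def predict(w, b, x):
    return 1 if b + sum(wj * xj for wj, xj in zip(w, x)) > 0 else 0

# Synthetic stand-in for feature vectors: clutter (class 0) vs. man-made target (class 1).
rng = random.Random(0)
clutter = [[rng.gauss(0.0, 0.3), rng.gauss(0.0, 0.3)] for _ in range(30)]
target = [[rng.gauss(2.0, 0.3), rng.gauss(2.0, 0.3)] for _ in range(30)]
X, y = clutter + target, [0] * 30 + [1] * 30
w, b = train_logistic(X, y)
accuracy = sum(predict(w, b, xi) == yi for xi, yi in zip(X, y)) / len(X)
```

Better interpolation of the polarimetric features would sharpen exactly this kind of class separation, which is what the intern will measure.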

Cluster and Discriminant Analysis for Vehicles Detection

It is very useful to build an automatic computer system that recognizes the types of vehicles passing a checkpoint from easy-to-collect data, such as the distances between axles and the weight on each axle. Such a system has many applications, for example in monitoring traffic volumes and identifying vehicle types, which helps in budgeting road maintenance costs. The main goal of this project is to develop a better methodology for cluster analysis, with application to the vehicle detection problem. The simplest clustering technique is K-means clustering.
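K-means alternates two steps: assign each point to its nearest centroid, then move each centroid to the mean of the points assigned to it. A self-contained sketch on hypothetical axle measurements (the numbers are made up for illustration):

```python
def kmeans(points, k, iters=20):
    """Lloyd's algorithm with deterministic farthest-point initialization."""
    def d2(p, q):
        return sum((a - b) ** 2 for a, b in zip(p, q))
    centroids = [points[0]]
    while len(centroids) < k:           # seed the centers far apart
        centroids.append(max(points, key=lambda p: min(d2(p, c) for c in centroids)))
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for p in points:                # assignment step
            clusters[min(range(k), key=lambda i: d2(p, centroids[i]))].append(p)
        centroids = [                   # update step: mean of each cluster
            [sum(col) / len(cl) for col in zip(*cl)] if cl else centroids[i]
            for i, cl in enumerate(clusters)
        ]
    return centroids, clusters

# Hypothetical (axle spacing in m, axle weight in t) readings: cars vs. trucks.
cars = [[2.6 + 0.1 * i, 0.9 + 0.05 * i] for i in range(5)]
trucks = [[6.0 + 0.1 * i, 8.0 + 0.2 * i] for i in range(5)]
centroids, clusters = kmeans(cars + trucks, k=2)
```

On such well-separated data the two centroids land at the car and truck means; the project's methodology aims to do better than this baseline on realistic, overlapping data.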

Comparison of Scalable Analytics Correlation and Classical Correlation in High-Frequency Finance

Measures of covariance and correlation between the returns of different financial assets are of great interest in the finance industry. In the high-frequency domain, raw price data is riddled with bad data points; the traditional Pearson definition of correlation is very sensitive to such outliers and should therefore not be applied directly to raw high-frequency data. Robust measures of correlation that are less sensitive to outliers can be used to improve the performance of popular financial methods.
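One simple robust alternative is Spearman's rank correlation, which applies Pearson's formula to ranks rather than raw values. The synthetic data below, with a single corrupted tick, shows the effect (the project may well use different robust estimators):

```python
import math

def pearson(x, y):
    """Classical Pearson correlation coefficient."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def ranks(v):
    """Rank of each element (0 = smallest); ignores ties for simplicity."""
    order = sorted(range(len(v)), key=lambda i: v[i])
    r = [0] * len(v)
    for rank, i in enumerate(order):
        r[i] = rank
    return r

def spearman(x, y):
    """Rank correlation: Pearson applied to ranks, hence robust to outliers."""
    return pearson(ranks(x), ranks(y))

x = [1, 2, 3, 4, 5, 6, 7]          # perfectly co-moving synthetic returns,
y = [1, 2, 3, 100, 5, 6, 7]        # except one corrupted tick in y
```

Here `pearson(x, y)` collapses to roughly 0.06 because of the single bad point, while `spearman(x, y)` stays near 0.79, illustrating why robust measures matter for raw high-frequency data.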

An Algebra-Based Approach to Program Verification

We want to study the feasibility of an algebra-based approach to program analysis. In particular, we are interested in the code comparison problem: tackling this problem enables us, among other things, to identify differences between two versions of a code or to check whether an optimized code is equivalent to its non-optimized version. We aim to reduce the comparison of programs to simple algebraic manipulations similar to those constantly performed in classical algebra.
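One lightweight way to illustrate comparing programs through their algebraic behavior is randomized identity testing: evaluate both fragments on random inputs, and for polynomial programs, agreement on many random points makes inequality overwhelmingly unlikely (Schwartz-Zippel). This is only an illustrative stand-in for the symbolic manipulation the project envisions:

```python
import random

def equivalent(f, g, arity, trials=50, lo=-10**6, hi=10**6, seed=7):
    """Probabilistic equivalence check for two programs viewed as functions.
    Returns False on any disagreement; True means 'equal with high probability'
    when f and g compute polynomials over the integers."""
    rng = random.Random(seed)
    for _ in range(trials):
        args = [rng.randint(lo, hi) for _ in range(arity)]
        if f(*args) != g(*args):
            return False
    return True

# An 'optimized' rewrite vs. the naive version of the same computation,
# plus a buggy rewrite that should be caught.
naive     = lambda a, b: a * a - b * b
optimized = lambda a, b: (a - b) * (a + b)   # valid factorization
buggy     = lambda a, b: (a - b) * (a - b)   # incorrect rewrite
same = equivalent(naive, optimized, arity=2)   # True
diff = equivalent(naive, buggy, arity=2)       # False
```

A fully algebraic approach would instead normalize both expressions symbolically and compare normal forms, but the goal is the same: reduce code comparison to algebra.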

Portfolio Management based on the Stochastic Portfolio Theory

The stochastic portfolio theory developed by Robert Fernholz is a mathematical framework for constructing portfolios and analyzing their behavior, as well as the structure of securities markets. Unlike much of classical portfolio theory, it is consistent with observed market behavior. Portfolio generating functions are the focus of the internship, as they constitute versatile tools for building portfolios with specific properties.
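A standard example of a portfolio generating function in this theory is the diversity function D_p(mu) = (sum_i mu_i^p)^(1/p), which generates the diversity-weighted portfolio pi_i = mu_i^p / sum_j mu_j^p and, for 0 < p < 1, tilts the market portfolio toward smaller stocks. A sketch with hypothetical market weights:

```python
def diversity_weights(market_weights, p=0.5):
    """Weights of the diversity-weighted portfolio generated by
    D_p(mu) = (sum mu_i^p)^(1/p): pi_i = mu_i^p / sum_j mu_j^p.
    For 0 < p < 1 this underweights large-cap names relative to the market."""
    powered = [m ** p for m in market_weights]
    s = sum(powered)
    return [x / s for x in powered]

mu = [0.5, 0.3, 0.2]                 # hypothetical market capitalization weights
pi = diversity_weights(mu, p=0.5)    # sums to 1; pi[0] < mu[0], pi[2] > mu[2]
```

The internship studies generating functions precisely because the resulting weights, and the portfolio's relative performance, can be read off from the function itself.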

User Modelling and Feature Selection for Personalized Local Search

Search engines, such as Google, have revolutionized the way we search for electronic information by providing a user with a ranked list of the documents most relevant to a particular query. This project with GenieKnows R&D, a search engine company, concerns an extension of this basic technology in which the goal is to incorporate geographic constraints into the search (e.g. find coffee shops near the Halifax Citadel). An additional challenge is to learn and incorporate user preferences when ranking the returned results.
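One simple way to combine textual relevance with a geographic constraint is to decay the score with distance from the query location. The blending formula, its weights, and the candidate data below are illustrative assumptions, not GenieKnows' ranking function (the Citadel coordinates are approximate):

```python
import math

def haversine_km(lat1, lon1, lat2, lon2):
    """Great-circle distance between two (lat, lon) points in kilometres."""
    r = 6371.0                                  # mean Earth radius
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def geo_score(text_score, dist_km, alpha=0.7, scale_km=2.0):
    """Blend textual relevance with exponential distance decay (assumed form)."""
    return alpha * text_score + (1 - alpha) * math.exp(-dist_km / scale_km)

qlat, qlon = 44.6475, -63.5803                  # Halifax Citadel, approximately
shops = [                                       # (name, text relevance, lat, lon)
    ("Shop A", 0.9, 44.70, -63.60),             # more relevant text, ~6 km away
    ("Shop B", 0.8, 44.6480, -63.5810),         # slightly less relevant, ~60 m away
]
ranked = sorted(shops,
                key=lambda s: geo_score(s[1], haversine_km(qlat, qlon, s[2], s[3])),
                reverse=True)
```

With these weights the nearby shop outranks the textually stronger but distant one; learning per-user values of parameters like `alpha` is one way the personalization challenge could be framed.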
