Innovative Projects Realized

Explore thousands of successful projects resulting from collaboration between organizations and post-secondary talent.

13,270 Completed Projects

Projects by province:
AB: 1,072 | BC: 2,795 | MB: 430 | NF: 106 | SK: 348 | ON: 4,184 | QC: 2,671 | PE: 43 | NB: 209 | NS: 474

Projects by Category

Computer science: 10% | Engineering: 9% | Engineering - biomedical: 1% | Engineering - chemical / biological: 4%

High throughput functional proteomics for surface proteins on mouse embryonic stem cells

Proteomics is the high-throughput analysis of the structure and function of proteins in biological systems. Our glycoprotein-targeted proteomic analysis employs liquid chromatography (LC) as a separation tool, in line with mass spectrometry (MS) as an analyser, to identify and quantify glycoproteins and their sites of glycosylation. In a typical analysis, proteins are extracted from biological samples, digested enzymatically into peptides, and the glycosylated peptides are chemically enriched. The N-linked glycopeptides are then selectively analyzed by LC-MS. Thousands of peptide MS spectra can be acquired in a short time and processed by bioinformatics tools to infer protein structure and function.

This project will focus on improving the sensitivity of the glycoproteomics method we developed previously (Sun et al., MCP, 2007). Successful remedies will be deployed to study membrane proteins of mouse embryonic stem (mES) cells. Our lab has extensive experience with these cells, and we have both host and gene-knockout mES cells in culture. The new knowledge gained from these more sensitive analyses can be evaluated readily by comparison against our curated databases to identify stem-cell-specific protein markers. Connections drawn between surface molecular topology and intracellular protein networks could also provide novel insights into stem cell biology and physiology.

Executing this project will give students hands-on experience with all aspects of proteomics technology, including protein chemistry, liquid chromatography separation techniques, tandem mass spectrometry, and data-analysis skills. Our lab also employs various molecular and cellular techniques, such as mammalian cell culture and assays, immunoassays, fluorescence imaging, and single-cell assays, which can be acquired during training. Successfully trained students will have the opportunity to continue the project as graduate students.

Faculty Supervisor: Bingyun Sun
University: Simon Fraser University
Program: Globalink Research Internship

Consistency of local classifiers under exchangeable data inputs

Traditionally, proofs of universal consistency of particular machine learning algorithms – including local learning algorithms such as the k-NN classifier – are given under the assumption that the data inputs are independent, identically distributed random variables. This assumption is often too strong, for instance when modelling learning and generalization for time series. A sequence of random variables X_1, X_2, … is called exchangeable if the joint law of any finite subsequence is invariant under permutations of its members. For instance, if we have two biased coins, A and B, with different probabilities of getting tails, then an exchangeable sequence of random variables is obtained by tossing either A or B, with the decision on which coin to toss made each time by tossing a third coin, C. A famous theorem of de Finetti states that, in essence, every exchangeable sequence of random variables is obtained in this way, as a mixture of a family of i.i.d. sequences. The notion of learnability has to be modified if the data inputs are assumed exchangeable, and in the earlier work of the project supervisor [V. Pestov, Predictive PAC learnability: a paradigm for learning from exchangeable input data. – In: Proc. 2010 IEEE Int. Conference on Granular Computing (San Jose, CA, 14-16 Aug. 2010), pp. 387-391, Symposium on Foundations and Practice of Data Mining. doi: 10.1109/GrC.2010.102] it was shown that a “conditional”, or “predictive”, version of a probably approximately correct (PAC) learnable concept class behaves well for such input variables and satisfies a result analogous to its classical i.i.d. counterpart. The next interesting question is to reformulate the notion of universal consistency, and to prove that a “conditional”, or “predictive”, universal consistency of the k-NN classifier holds under exchangeable inputs.
Moreover, as exchangeable inputs can be easily simulated, it would be very interesting to stage a large-scale experiment using the High Performance Computing Virtual Laboratory (HPCVL) in order to test the asymptotic performance of the learning algorithm under such inputs and compare it to the performance under the traditional i.i.d. inputs. Thus, the research problem has both a theoretical and a practical dimension, which can be pursued in parallel or even independently of each other. This is a rich project, sufficient not only for a Summer MITACS project, but for a good Ph.D. thesis as well!
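The coin-mixture construction described above is indeed easy to simulate. The following is a minimal, hypothetical Python sketch (not part of the project): it draws an exchangeable label sequence as a two-coin mixture in the spirit of de Finetti's theorem, attaches simple one-dimensional features, and measures the accuracy of a plain k-NN classifier on held-out points. All names and parameter values are illustrative.

```python
import random

def exchangeable_sequence(n, p_a=0.2, p_b=0.8, seed=0):
    """Draw an exchangeable binary sequence: toss a coin C once to pick
    coin A or B, then generate the whole sequence i.i.d. from that coin
    (a two-component mixture, as in de Finetti's theorem)."""
    rng = random.Random(seed)
    p = p_a if rng.random() < 0.5 else p_b  # latent coin choice
    return [1 if rng.random() < p else 0 for _ in range(n)]

def make_dataset(labels, rng):
    """Toy 1-D features: label 1 clusters near 1.0, label 0 near 0.0."""
    return [(y + rng.gauss(0, 0.3), y) for y in labels]

def knn_predict(train, x, k=3):
    """Plain k-NN on 1-D inputs with majority vote."""
    nearest = sorted(train, key=lambda pt: abs(pt[0] - x))[:k]
    votes = sum(y for _, y in nearest)
    return 1 if 2 * votes > len(nearest) else 0

rng = random.Random(42)
labels = exchangeable_sequence(600, seed=7)
data = make_dataset(labels, rng)
train, test = data[:500], data[500:]
correct = sum(knn_predict(train, x) == y for x, y in test)
print(f"k-NN accuracy on exchangeable inputs: {correct / len(test):.2f}")
```

Testing "conditional" consistency experimentally would compare such accuracy curves, conditioned on the latent coin, against the same classifier run on i.i.d. inputs.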

Faculty Supervisor: Vladimir Pestov
University: University of Ottawa
Program: Globalink Research Internship

Secure data aggregation in unattended wireless sensor networks

Wireless sensor networks (WSNs) consist of large numbers of tiny, low-cost sensors. Many sensor networks have mission-critical tasks that involve data collection in remote, inaccessible, or hostile environments, such as battlefields, deserts, and mountains. These sensors are normally monitored and managed by a trusted authority commonly known as the sink or collector. In certain special classes of WSNs, this sink may not be online all the time; it visits and collects information from the nodes at certain intervals. Such WSNs are known as unattended wireless sensor networks (UWSNs) [1]. Since the sink collects information only at intervals, every node has to secure its data until the sink's next visit. Security mechanisms must therefore ensure both data protection (also called data survivability) and authentication of sensors.

Existing distributed security mechanisms for WSNs [2] are not suitable for UWSNs due to the infrequent visits of the sink. Cryptographic key management techniques provide data authenticity [3, 4] and integrity but do not ensure data survivability. Self-healing in UWSNs has also been widely studied [5]; in self-healing techniques, nodes can regenerate keys and continue functioning normally after being compromised. Most existing schemes assume that the sensors are static between successive visits of the sink, and efficient data survivability in mobile UWSNs has not been well studied. Moreover, in a large field it might not be efficient for the sink to visit all nodes, which makes secure data aggregation an important problem. Hence, we will design efficient algorithms that ensure secure data aggregation by special sensor nodes called aggregators; the sink can then visit only the aggregator nodes instead of all nodes.

In this project, we will study the following problems:
1. Efficient data survivability in mobile UWSN, and
2. Efficient data aggregation with self-healing in UWSN

We will not only design algorithms, but also simulate these algorithms using simulators like NS2.
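As a rough illustration of the aggregator idea (not of the algorithms to be designed in this project), the sketch below has each sensor authenticate its reading with a MAC under a key shared with the sink; the aggregator forwards the sum together with the per-node MACs, so the sink can verify the aggregate at its next visit from the aggregator's report alone. A practical UWSN scheme would also need key self-healing and more compact (e.g. homomorphic) authentication; all node names and keys here are hypothetical.

```python
import hmac, hashlib

def mac(key: bytes, reading: int) -> bytes:
    """Per-sensor MAC over a reading, under a key shared with the sink."""
    return hmac.new(key, str(reading).encode(), hashlib.sha256).digest()

# Hypothetical sensor keys (a real UWSN would derive these via key management).
keys = {f"node{i}": hashlib.sha256(f"k{i}".encode()).digest() for i in range(4)}
readings = {"node0": 21, "node1": 19, "node2": 22, "node3": 20}

# Aggregator: instead of the sink visiting every node, it forwards the
# aggregate plus each node's authentication tag.
report = {
    "sum": sum(readings.values()),
    "auth": {n: (r, mac(keys[n], r)) for n, r in readings.items()},
}

def sink_verify(report) -> bool:
    """Sink (at its next visit): check every MAC, then check the aggregate."""
    total = 0
    for node, (reading, tag) in report["auth"].items():
        if not hmac.compare_digest(tag, mac(keys[node], reading)):
            return False
        total += reading
    return total == report["sum"]

print("aggregate verified:", sink_verify(report))
```

Tampering with either the aggregate or any individual reading causes verification to fail, which is the basic property a secure aggregation scheme must preserve under far weaker trust assumptions.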

References:

[1] R. D. Pietro, L. V. Mancini, C. Soriente, A. Spognardi, and G.
Tsudik, “Data security in unattended wireless sensor networks”, IEEE
Transactions on Computers, vol. 58, pp. 1500-1511, 2009.

[2] X. Chen, K. Makki, K. Yen, and N. Pissinou, “Sensor network
security: a survey”, IEEE Communications Surveys Tutorials, vol. 11,
no. 2, pp. 52-73, 2009.

[3] R. D. Pietro, C. Soriente, A. Spognardi, and G. Tsudik,
“Collaborative authentication in unattended WSNs”, in ACM WISEC, pp.237-244, 2009.

[4] T. Dimitriou and A. Sabouri, “Pollination: A data authentication
scheme for unattended wireless sensor networks”, in IEEE 10th
International Conference on Trust, Security and Privacy in Computing
and Communications (TrustCom), Nov. 2011, pp. 409-416, 2011.

[5] R. D. Pietro, D. Ma, C. Soriente, and G. Tsudik, “Posh: Proactive
cooperative self-healing in unattended wireless sensor networks”, in
IEEE SRDS, pp. 185-194, 2008.

Faculty Supervisor: Amiya Nayak
University: University of Ottawa
Program: Globalink Research Internship

Looking for drugs in all the right places: Drug sewage “epidemiology”

We will determine the use levels and patterns of selected pharmaceuticals in the Manitoba population and selected subpopulations, and investigate the magnitude of such use during episodic events (e.g., at resorts occupied only part of the year, with likely different drug use patterns). Our hypothesis is that an appropriate analysis of surrogate materials, namely wastewaters, provides a realistic measure of the amounts and types of many drugs used by the public and, through tracer analysis, of the extent to which these drugs may contaminate surface waters receiving wastewater discharges.

Characterizing drug usage helps ascertain overall public health status, as it estimates the quantities of drugs needed for various ailments. However, the typical means of assessing this information, such as sales and prescription records, have several major disadvantages; for example, they do not indicate actual use. Questionnaires depend on accurate reporting, which may not be forthcoming for various reasons (e.g., privacy concerns).

Because drugs are incompletely metabolized, they are excreted into sewage systems and thus enter wastewater treatment facilities. Indeed, some household chemicals, such as the artificial sweetener sucralose, are poorly metabolized and appear to be environmentally stable, so their presence in the environment can indicate contamination by wastewaters. Sewage measurement would provide an objective, anonymous (i.e., no personal information needed, therefore protecting privacy), and aggregate means of assessing drug consumption, if residues found in wastewaters could be correlated with use. However, current measurement efforts can be quite inaccurate. For instance, drug use can vary by season or even time of day, and most monitoring studies simply collect grab samples of water, which do not accurately capture common short-term concentration changes.

We will estimate overall levels of Manitoba drug usage through analysis of wastewater influents in selected communities representing a cross-section of the province. Urban areas are represented by Winnipeg’s treatment plants. Less populated rural communities are represented by the sewage lagoons of Morden and Winkler in southern Manitoba, site of previous work by a MITACS student. Areas with significant seasonal variations in occupancy and therefore likely different drug usage, such as recreational and resort regions, are represented by Grand Marais near the shores of Lake Winnipeg.

Unlike previous efforts, we will take advantage of passive sampling technology to obtain continuous, time-weighted-average concentrations of drugs, using appropriate calibration rates we helped develop. These samplers will be deployed in wastewater effluents and influents sequentially to determine spatial and temporal trends in concentrations measured by ultrahigh performance liquid chromatography-tandem mass spectrometry. With treatment flow rates, we calculate drug inputs and outputs, which we believe from earlier work to be constant per capita given ubiquitous use. Sucralose sewage loadings will determine how much “dilution” of sewage is present in surface waters that contain this chemical, helpful for mapping the extent of wastewater impacts in Manitoba.
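The load calculation behind this approach is a simple mass balance: a time-weighted-average influent concentration multiplied by the plant's flow rate gives the daily mass entering the plant, which can then be normalized per capita. The sketch below uses illustrative (not measured) numbers; back-calculating actual consumption would additionally require drug-specific excretion and treatment-loss factors.

```python
def daily_load_g(conc_ng_per_L: float, flow_m3_per_day: float) -> float:
    """Mass load entering the plant: concentration x flow.
    ng/L x m^3/day = ug/day (1 m^3 = 1000 L); divide by 1e6 for grams."""
    return conc_ng_per_L * flow_m3_per_day / 1e6

def per_capita_mg(load_g: float, population: int) -> float:
    """Normalize a daily load (g/day) to mg per person per day."""
    return load_g * 1000 / population

# Illustrative values only: a time-weighted-average influent concentration
# from a passive sampler, a plant flow rate, and a served population.
conc = 500.0        # ng/L, TWA from passive sampler
flow = 200_000.0    # m^3/day, plant influent
pop = 700_000       # people served

load = daily_load_g(conc, flow)
print(f"{load:.1f} g/day, {per_capita_mg(load, pop):.3f} mg/person/day")
```

Comparing the same calculation on influent and effluent concentrations gives the plant's removal, and the sucralose tracer loading plays the analogous role for estimating dilution in receiving waters.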

This work will provide objective, population-wide data on drug use rates and patterns in Manitoba, data that are currently lacking.

Faculty Supervisor: Charles Wong
Partner: Yes
University: University of Manitoba
Program: Globalink Research Internship

Submicron resolution dielectric loss spectroscopy for a scanning probe microscope

The current focus of this research group is developing a scanning-probe-microscope-based approach that will enable us to perform dielectric loss spectroscopy (i.e., loss tangent measurements) on thin-film samples with submicron spatial resolution. Our goal is to develop this technique, based on a dynamic form of electrostatic force microscopy, into a useful addition to the suite of available probing approaches. Our efforts focus on composite polymer membranes used in a range of devices, including an artificial photosynthesis system and other conducting polymer membranes such as those used in fuel cells. Another key area of research is the potential for developing new materials for electrical insulation. In recent years, much interest has been generated by distributing nanoparticles/nanofillers through insulating polymers as a technique for modifying the dielectric constant of the insulator. Our interest is in whether these approaches lead to local weak points due to the abrupt variation of dielectric character at submicron length scales. As the required spatial resolution is not attainable with conventional measurement approaches, our new approach is well placed to assist our collaborators in these studies.

Faculty Supervisor: Derek Oliver
University: University of Manitoba
Program: Globalink Research Internship

Bayesian Methods for Meta-Analysis with Applications to Multi Arm Trials with Binary Outcomes

Recently, there has been growing interest in meta-analysis in many areas, including medicine, education, psychology, and the social sciences. In the literature, there are two main approaches to meta-analysis: the fixed effect model and the random effects model. Under the fixed effect model, we assume that there is one true effect size shared by all included studies; the combined effect is our estimate of this common effect size. If there is no statistical heterogeneity among studies, differences across studies may be due to random variation, and the fixed effect model may be appropriate. Under the random effects model, we assume that the true effect can vary from study to study. Muthukumarana and Tiwari (2012) developed a random-effects model using Dirichlet process priors to account for heterogeneity among studies. This project will focus on enhancing the methodology developed in that paper. More specifically, the methodology will be extended to a multivariate version of random-effects meta-analysis with binary outcomes. This extension is important when several zeros are observed in some studies, which compels the introduction of zero-inflated binomial (ZIB) models for meta-analysis.
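To make the ZIB idea concrete, the sketch below writes down the zero-inflated binomial likelihood for a set of hypothetical study arms and finds a crude maximum-likelihood fit by grid search; the actual project would instead place priors (e.g., Dirichlet process priors, following Muthukumarana and Tiwari) on these parameters and fit the model in a Bayesian way. The data and grid are purely illustrative.

```python
from math import comb, log

def zib_pmf(k: int, n: int, p: float, pi: float) -> float:
    """Zero-inflated binomial: with probability pi a study arm reports a
    'structural' zero; otherwise counts follow Binomial(n, p)."""
    binom = comb(n, k) * p**k * (1 - p)**(n - k)
    return pi + (1 - pi) * binom if k == 0 else (1 - pi) * binom

def zib_loglik(data, p: float, pi: float) -> float:
    """Log-likelihood of (events, trials) pairs pooled across studies."""
    return sum(log(zib_pmf(k, n, p, pi)) for k, n in data)

# Hypothetical study arms: (events, patients), with several zero counts.
studies = [(0, 40), (2, 35), (0, 50), (5, 60), (1, 45)]

# Crude grid-search MLE as a stand-in for the Bayesian fit.
grid = [(p / 100, pi / 10) for p in range(1, 30) for pi in range(0, 9)]
p_hat, pi_hat = max(grid, key=lambda t: zib_loglik(studies, *t))
print(f"grid-search MLE: p={p_hat:.2f}, pi={pi_hat:.1f}")
```

The multivariate extension would replace the single (p, pi) pair with per-arm parameters linked across the arms of each trial.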

Faculty Supervisor: Saman Muthukumarana
University: University of Manitoba
Program: Globalink Research Internship

Development for the Symphony of Devices

The number of computing devices owned (and carried) by any given person is increasing. There is a clear need and opportunity not simply to replicate experiences across form factors, but to enable applications to easily span them. In the future, it is easy to anticipate experiences that seamlessly grow and shrink by annexing nearby displays and input devices: the Netflix of the future is not one that can simply play the same movie on multiple devices, but one where any device in a viewer’s pocket can serve as a remote control and supplementary information display for the screen showing the film. Two people trying to find a common date for a meeting will be able to effortlessly show an overlap of their calendars on a nearby screen. A user sitting at his laptop should be able to easily slide a table-of-contents page of a document to his iPad, and use it as an index to select which pages are shown on the PC. We term this personal computing experience the Personal Symphony of Devices (PSoD).

Faculty Supervisor: Daniel Wigdor
Partner: Yes
University: University of Toronto
Program: Globalink Research Internship

Reducing NETosis for improving lung health

Background, knowledge gap and therapeutic value: Lung disease is the primary cause of morbidity and mortality in patients with cystic fibrosis (CF). One of the major culprits of CF airway disease is chronic inflammation associated with dying neutrophils, accumulating DNA, and colonizing Pseudomonas aeruginosa. However, the factors and pathways that regulate these pathological changes, particularly those relevant to neutrophil death, in CF airways are unclear.

NETosis, the death of neutrophils through the formation of neutrophil extracellular traps (NETs), is a recently identified form of cell death. Our data strongly suggest that dysregulation of NETosis contributes to the accumulation of DNA and cytotoxic neutrophil by-products in CF lung disease. We aim to inhibit this pathway to treat CF lung disease and prevent the deterioration of lung health.

The overall objectives of our project are to identify the host and microbial factors that regulate NETosis in inflamed lungs and to devise therapeutic strategies to suppress NETosis.

Specific aims and experimental approaches: (i) The first aim is to determine the key factors (host and microbial components) that regulate NETosis. We will use wild-type mice and a mouse model with CF-like lung conditions to identify the host factors. To identify the bacterial factors, we will use clinical strains of bacteria and their components in NETosis assays. (ii) The second aim is to elucidate the relevance of these factors in CF lung disease and to suppress NETosis in CF-like mouse airways. We will use human (healthy and CF) blood, BAL, and sputum samples, as well as in vivo mouse models, to achieve this aim.

Significance: Identifying molecules that suppress NETotic neutrophil death in the CF airways could help to devise novel therapeutics for treating CF lung disease.

Faculty Supervisor: Nades Palaniyar
University: University of Toronto
Program: Globalink Research Internship

Multihop cooperative wireless communications

Collaborative network participants can jointly achieve advanced networking functions beyond the simple relaying of data packets. They enable distributed network reconfiguration and autonomous tuning of software and hardware among peers, to support diverse and evolving application requirements and networking environments. However, the combination of mutual interference, network scale, decentralized control, and possible multihop radio instability brings new challenges to the paradigm of intelligent collaboration in future-generation wireless networks. We conduct research to create new theories and technologies that promote intelligent collaboration among peer devices in a wireless system. This project aims to achieve efficient provisioning of resources and services by leveraging the joint communication and processing power of multiple wireless devices in proximity.

Faculty Supervisor: Ben Liang
University: University of Toronto
Program: Globalink Research Internship

Mobility management for next-generation heterogeneous wireless systems

To support the ubiquitous availability of broadband applications and services, multiple network-access technologies, including the wired Internet and various wireless networks, are expected to co-exist and interoperate. This integration of heterogeneous access networks brings forth unique challenges in the design of multimedia applications and services, since no single network meets the ideal of high bandwidth, universal availability, and low cost. This project aims to provide innovative solutions to network inter-connectivity and wireless resource management, to allow efficient and transparent services to mobile users across heterogeneous networking platforms.

Faculty Supervisor: Ben Liang
Partner: Yes
University: University of Toronto
Program: Globalink Research Internship

Scalable and Reliable Storage Virtualization

Two emerging technologies, virtualized and cloud storage, are transforming the storage infrastructure of computing systems. Both technologies enable increased resource sharing, promising more flexible, manageable, and cost-efficient use of storage resources. However, these shared storage technologies raise several challenges for storage providers. Storage systems must meet the competing and conflicting performance demands of customers. They must be designed for heterogeneous storage units with widely-varying characteristics, such as low-end and enterprise disks, and a variety of flash devices. Storage reliability and security become critical concerns because loss of data availability or data compromise can affect a large number of users, potentially damaging customers’ businesses. These and other requirements, such as reducing energy consumption, result in growing complexity of shared storage systems, with significant time and effort being spent designing, customizing, and maintaining storage solutions, both at the provider and customer ends.

Our key observation is that a policy-based architecture that allows flexible specification and enforcement of storage requirements is essential for managing the complexity of shared storage systems. We propose a storage architecture in which high-level policies allow expressing application-level requirements using a common management interface, and the system enforces these requirements by monitoring both storage requests and the dynamic characteristics of storage devices. Our policy-based storage system will adapt to changes in workload, hardware configurations, power consumption and load hot-spots, while providing reliable and secure storage to clients. It will help reduce storage provider costs, while enabling greater flexibility for their customers, thereby increasing the use of virtualization and cloud storage technologies.

Faculty Supervisor: Ashvin Goel
Partner: Yes
University: University of Toronto
Program: Globalink Research Internship

Optimal adaptation of radiation therapy treatments

Intensity-modulated radiation therapy (IMRT) is an advanced cancer treatment technology that uses beams of high-energy x-rays to deliver radiation to a tumour. In IMRT, radiation beams are divided into many small beamlets. The intensity of each beamlet is computed using specialized software, in which the treatment planning problem is modeled as a mathematical optimization problem and solved using optimization algorithms.

Treatments need to account for potential uncertainties that may degrade treatment quality. For example, tumours in the lung move as the patient breathes, so when solving the mathematical optimization problem to design a radiation therapy treatment, such motion must be accounted for. My research group has designed novel robust optimization methods to optimize radiation therapy treatments subject to such uncertainties. Furthermore, we have developed adaptive methods that allow the treatment to be adapted to patient changes as the treatment progresses – treatments are normally spread over multiple weeks.

This Globalink project will build on existing research in my group on adaptive and robust radiation therapy. In particular, two related problems will be explored. The first question is: how often should a treatment be adapted? Our initial research shows that treatments that adapt to uncertainty perform better than those that do not, but the question of how often to adapt is still open. Frequent adaptation likely leads to better clinical results, but it is costly for the hospital. To address this question, we will start with an empirical analysis using historical patient data and exhaustive search to determine optimal treatment adaptation times (retrospectively), given a budget of one adaptation, two adaptations, and so on. That is, if we are allowed to re-optimize the treatment once to adapt to observations of the uncertainty, when should we re-optimize? Guided by the empirical findings, we will develop a mathematical model that can be applied to future treatment cases to determine guidelines on treatment adaptation.
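The retrospective exhaustive search can be sketched as follows. The code assumes a toy surrogate for plan quality in which error accumulates linearly with patient "drift" since the last re-optimization (the real study would score historical dose data instead), and it enumerates every way to spend a fixed budget of adaptations over the course of treatment:

```python
from itertools import combinations

def plan_quality(schedule, n_fractions=30, drift=0.02):
    """Toy retrospective score: error grows linearly with drift since the
    last (re)optimization. Purely illustrative, not a clinical model."""
    error, last = 0.0, 0
    for t in range(n_fractions):
        if t in schedule:
            last = t  # treatment re-optimized at fraction t
        error += drift * (t - last)
    return -error  # higher is better

def best_schedule(budget, n_fractions=30):
    """Exhaustive search over all ways to spend `budget` adaptations."""
    return max(combinations(range(1, n_fractions), budget),
               key=plan_quality)

print("budget 1:", best_schedule(1))
print("budget 2:", best_schedule(2))
```

Under this linear-drift toy model, a single adaptation lands mid-course, as symmetry suggests; with real dose data the same enumeration yields the retrospectively optimal adaptation times for each budget.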

The second problem will focus on extending our previously developed mathematical models for adapting a treatment, to incorporate previous dose information in a novel way. We have developed preliminary adaptive optimization models that account for previous dose information (e.g., if certain parts of the tumour are underdosed, the model focuses on increasing the dose to those regions on subsequent treatment days), but they require more extensive testing. We have hypothesized additional enhancements to the model that can more accurately measure the dose already delivered and adjust future dose requirements. We will implement these new algorithmic ideas, incorporate them into our previously developed models, and test whether dose results improve with the new model.
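A minimal version of re-optimization with previous-dose feedback might look like the sketch below: beamlet weights are fit to a target dose by projected gradient descent, and a mid-course adaptation raises the remaining target for an underdosed voxel before re-optimizing. The dose-deposition matrix and all numbers are toy values rather than clinical data, and plain least squares stands in for the group's actual robust and adaptive formulations.

```python
# Toy setup: 3 beamlets, 4 tumour voxels; D[v][b] = dose to voxel v per
# unit weight of beamlet b (illustrative values, not clinical).
D = [[1.0, 0.2, 0.0],
     [0.5, 0.8, 0.1],
     [0.1, 0.9, 0.4],
     [0.0, 0.3, 1.0]]
target = [60.0] * 4  # prescribed dose per voxel

def dose(w):
    """Dose delivered to each voxel for beamlet weights w."""
    return [sum(D[v][b] * w[b] for b in range(3)) for v in range(4)]

def optimize(w, tgt, steps=2000, lr=0.05):
    """Least-squares fit of beamlet weights to the target dose with
    nonnegative weights (projected gradient descent)."""
    for _ in range(steps):
        d = dose(w)
        grad = [sum(2 * (d[v] - tgt[v]) * D[v][b] for v in range(4))
                for b in range(3)]
        w = [max(0.0, w[b] - lr * grad[b]) for b in range(3)]
    return w

# Initial plan.
w = optimize([0.0, 0.0, 0.0], target)

# Mid-course adaptation: suppose the delivered dose so far underdosed
# voxel 2 (e.g., due to motion); raise its remaining target to compensate.
adapted_target = list(target)
adapted_target[2] += 5.0
w2 = optimize(w, adapted_target)
print("adapted dose:", [round(x, 1) for x in dose(w2)])
```

The enhancement the project describes would refine how the "dose already delivered" term is measured, but the re-optimize-against-an-adjusted-target loop stays the same shape.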

Faculty Supervisor: Timothy Chan
Partner: Yes
University: University of Toronto
Program: Globalink Research Internship