Big data is a collection of data sets so large and complex that it becomes difficult to process using on-hand database management tools or traditional data processing applications. The challenges include capture, curation, storage, search, sharing, transfer, analysis, and visualization. In this internship, we analyze real-world big data sets to make sensible inferences by taking into account a selected range of criteria. A number of methods and algorithms are investigated, evaluated, and evolved to advance the development of specialized tools and processes.
This research will help Tap for Tap generate and collect information on user behavior that can improve its matching algorithms. The core of the work consists of analyzing a large dataset of app users' responses to advertising. Much of this search for the determinants of successful matching will be statistical in nature, but it will also be guided by an alert eye on the mechanisms that may lie behind the relationships uncovered in the data.
For producers of film and TV, extending a brand into the digital space can be a daunting enterprise. Estimating how successful a Twitter campaign or an Alternate Reality Game has been in building an audience and fostering goodwill toward a TV or film property can be a frustratingly inscrutable endeavor. Even when a campaign appears to have been a critical success, the copious textual and statistical data at hand can make it nearly impossible to determine what went right, much less what went wrong. ARGO seeks to solve this problem.
Developing subject-based social media requires proper interaction with users by learning and analyzing their profiles and dynamically incorporating their opinions. In this project, a crowd-based profiling of users with respect to their peers will be developed, and a confidence factor, calculated from the collective opinion of the crowd, will be assigned to each individual's opinion. A prototype system has already been developed as a proof of concept. The intern will enhance the prototype by adding new functions and modules.
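One way the crowd-based confidence factor above might work is sketched below. The exact formula is not specified in the project description; this sketch assumes agreement is measured as the normalized deviation of an individual's opinion from the crowd mean, and all names (`confidence_factor`, the thresholds) are illustrative.

```python
# Hypothetical sketch of a crowd-based confidence factor: an individual's
# opinion is weighted by how closely it agrees with the crowd's collective
# opinion. This assumes agreement = 1 - normalized absolute deviation from
# the crowd mean; the project's actual formula may differ.

def confidence_factor(opinion, crowd_opinions):
    """Return a value in [0, 1]; 1 means full agreement with the crowd mean."""
    crowd_mean = sum(crowd_opinions) / len(crowd_opinions)
    # Normalize by the crowd's spread (guard against a zero spread).
    spread = (max(crowd_opinions) - min(crowd_opinions)) or 1.0
    deviation = abs(opinion - crowd_mean) / spread
    return max(0.0, 1.0 - deviation)

# A rating matching the crowd consensus receives full confidence.
print(confidence_factor(4.0, [3.5, 4.0, 4.5, 4.0]))  # → 1.0
```

An outlier opinion (e.g. rating 1.0 against the same crowd) would receive a confidence near zero, so its weight in the aggregated profile shrinks accordingly.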
In this research, we propose the design of an efficient automated system that can effectively learn from patterns of malicious activity, in the form of distributed denial-of-service (DDoS) attacks against web servers on the Internet, and subsequently offer potential victims protection against such attacks by blocking malicious requests in advance.
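The "block malicious requests in advance" idea can be illustrated with a minimal sliding-window rate check per source address. This is a sketch only: the threshold and window below are placeholder constants, whereas the proposed system would learn its decision criteria from observed attack patterns.

```python
# Minimal sketch of preemptive request banning: a per-IP sliding-window
# rate check. WINDOW_SECONDS and MAX_REQUESTS are assumed placeholders;
# the project would learn such thresholds from DDoS traffic patterns.
from collections import defaultdict, deque

WINDOW_SECONDS = 10
MAX_REQUESTS = 100

class RateGuard:
    def __init__(self):
        self.history = defaultdict(deque)  # ip -> recent request timestamps
        self.banned = set()

    def allow(self, ip, now):
        if ip in self.banned:
            return False
        q = self.history[ip]
        q.append(now)
        # Drop timestamps that fell out of the sliding window.
        while q and now - q[0] > WINDOW_SECONDS:
            q.popleft()
        if len(q) > MAX_REQUESTS:
            self.banned.add(ip)  # flood detected: ban this source
            return False
        return True

guard = RateGuard()
# 150 requests from one IP within a single second trip the threshold;
# everything after the 100th request is rejected.
verdicts = [guard.allow("10.0.0.1", t / 150) for t in range(150)]
print(verdicts.count(False))  # → 50
```

A real deployment would replace the fixed threshold with a model trained on attack traces, and would distinguish flash crowds from coordinated floods before banning.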
In the modern world, people want fast access to the right data at the right time, and the Internet is the facilitator of this data transmission and interconnection. People send and receive data through many different devices. In this ubiquitous era, service providers are moving from in-shop servers to cloud environments and from "multiple implementations for different devices" to a "one application fits all devices" paradigm. These paradigm shifts, while attractive, open systems to previously unknown security risks.
Machine-to-machine (M2M) devices are defined as equipment that communicates without direct human interaction. M2M networks are predicted to have a large end-user market in the near future, with numerous potential applications such as home automation, patient monitoring, transportation, and smart metering. Currently, the main bottleneck is reducing the overall cost of this equipment in order to enable practical implementation of densely deployed M2M networks that can cover an area of interest and, for example, connect different regions of a city.
STMicroelectronics has designed a new multicore processor, STHORM, which is now going into production. This system promises significant improvements in performance per unit of energy but poses new challenges in programming. Unlike conventional processors, STHORM does not automatically move program data between the various levels of the processor's memory system. This is one of the reasons it is more energy-efficient, but the onus of data management falls on the programmer.
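The programming style this implies can be sketched as explicit, double-buffered data staging: the program copies the next tile of data toward fast local memory while computing on the current one. The sketch below is purely illustrative; `dma_copy` and `process_tiles` are hypothetical names, not part of any STHORM API.

```python
# Hypothetical sketch of programmer-managed data movement, as required on
# STHORM-like processors that do not move data between memory levels
# automatically. dma_copy stands in for an explicit global-to-local transfer.

def dma_copy(src, start, count):
    """Stand-in for a DMA transfer from global memory into a local buffer."""
    return src[start:start + count]

def process_tiles(data, tile_size):
    results = []
    # Double buffering: stage the next tile while computing on the current one.
    current = dma_copy(data, 0, tile_size)
    for offset in range(tile_size, len(data) + tile_size, tile_size):
        nxt = dma_copy(data, offset, tile_size)   # prefetch the next tile
        results.extend(x * 2 for x in current)    # compute on the current tile
        current = nxt
    return results

print(process_tiles(list(range(8)), 4))  # → [0, 2, 4, 6, 8, 10, 12, 14]
```

The point of the pattern is that transfers and computation overlap, which is what makes the explicit-memory design energy-efficient despite the extra programming effort.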
This project aims to develop and evaluate new geometric modeling algorithms for power line and building objects, which are required for conducting power-line-related asset risk analysis in challenging environments, taking wind effects into account. The project also aims to integrate the newly developed algorithms into York University's in-house power line modeling test bed and evaluate their performance using GDI's extensive inventory data.
This project seeks to develop GPU-based versions of CPU-implemented subsurface scattering shaders. Rendering in video games and movies requires increasingly complex simulation of skin and other effects of translucent materials. As a visual effects solution provider, Cebus Visual Technology is very interested in implementing its shaders on the GPU to increase computational throughput. Such an experience would be beneficial for an intern.