Denial-of-service (DoS) attacks deny a service, such as access to a website or a network, by deliberately congesting server or network resources. In addition to delivering digital content to end users, content delivery networks (CDNs) are expected to protect content origins, such as Netflix or Amazon Video, against denial-of-service threats. However, such attacks can not only evade a CDN's protection but also exploit its resources to damage both content providers and the CDN itself. As such, traditional security mechanisms are no longer sufficient.
Pintellect is enterprise social software that gives employees access to the thoughts and ideas of an organization's influencers by encouraging them to share links to internal files or external resources such as books, TED talks, podcasts, and articles. The objective of this project is to develop multiple algorithmic solutions for a curated, per-department content feed on the dashboard, based on a number of identified criteria.
Content Delivery Networks (CDNs) represent the current standard for transferring data across the ever-growing Internet. They are designed to manage traffic streams so as to avoid network problems. Although CDNs attempt to satisfy security requirements (authentication, data privacy, and integrity), they face increasingly innovative threats observable in cyberspace. The main objective of this project is to design, implement, and test new methods to detect and prevent malicious activity in CDNs. We aim to build an alternative to classical Web Application Firewalls (WAFs).
We are building and growing a team of researchers with expertise in machine learning and data mining. Ultimately, our aim is to create solutions that eliminate the need to manually define personalization strategies. We are in the process of signing partnership agreements with retailers capable of collecting large-scale datasets of customer behaviour. Through a data-sharing/consulting partnership, we plan to research the design of recommender systems customized for the datasets available to brick-and-mortar retailers.
Networks are ubiquitous in our world. In broad terms, a networked control system consists of sensors, actuators, and controllers interconnected and coordinated through a communication network. Networked control with distributed intelligence can open new directions in the robotic-entertainment industry, allowing pursuer-evader games to be played with multiple robots. The research proposed here takes a first step in this direction.
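To make the pursuer-evader idea concrete, the sketch below simulates one robot under simple pursuit guidance, where the pursuer always heads straight toward the evader's current position. The positions, speed, and time step are illustrative assumptions, not parameters from the project.

```python
import math

def pursuit_step(pursuer, evader, speed, dt):
    """Move the pursuer one time step directly toward the evader.
    A toy sketch of pure-pursuit guidance; not the project's controller."""
    dx, dy = evader[0] - pursuer[0], evader[1] - pursuer[1]
    dist = math.hypot(dx, dy)
    if dist < 1e-9:
        return pursuer  # already at the target
    step = min(speed * dt, dist)  # do not overshoot the evader
    return (pursuer[0] + step * dx / dist, pursuer[1] + step * dy / dist)

# Pursuer starts at the origin; the evader sits (stationary) at (10, 0).
p, e = (0.0, 0.0), (10.0, 0.0)
for _ in range(20):
    p = pursuit_step(p, e, speed=1.0, dt=1.0)
print(p)  # (10.0, 0.0): the pursuer has closed the gap
```

In a networked setting, each call to `pursuit_step` would be driven by sensor measurements of the evader's position arriving over the communication network, which introduces the delays and packet losses that networked control must handle.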
Environmental modelling and monitoring software systems, which are very important in assessing the effects of climate change, require open data from a large number of sources, including all levels of government, NGOs such as watershed-management authorities, consultants, and businesses. This data needs to be brought together into internal databases and kept up to date to perform the required underlying computations. Collecting this data manually and keeping it current requires an enormous amount of error-prone manual labour.
Deep learning is currently the dominant machine learning technique, as a result of state-of-the-art performance in vision (Russakovsky et al., 2015), speech (Amodei et al., 2015), and natural language processing (Vinyals et al., 2015). The improvement in performance of these models is attributed to the availability of large training datasets as well as software and hardware improvements that accelerate training. Recurrent neural networks (RNNs) are among the most powerful and popular frameworks for modeling sequential data such as speech and text.
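For readers unfamiliar with the recurrence an RNN applies to sequential data, the following is a minimal sketch of a vanilla RNN forward pass in NumPy. The dimensions, random initialization, and activation choice are illustrative assumptions, not details of any particular model from the cited work.

```python
import numpy as np

rng = np.random.default_rng(0)
input_size, hidden_size, seq_len = 4, 8, 5

# Randomly initialized parameters (for illustration only).
W_xh = rng.normal(scale=0.1, size=(hidden_size, input_size))
W_hh = rng.normal(scale=0.1, size=(hidden_size, hidden_size))
b_h = np.zeros(hidden_size)

def rnn_forward(xs):
    """Apply the vanilla RNN recurrence to a sequence of input vectors:
    h_t = tanh(W_xh @ x_t + W_hh @ h_{t-1} + b_h)."""
    h = np.zeros(hidden_size)
    states = []
    for x in xs:
        h = np.tanh(W_xh @ x + W_hh @ h + b_h)
        states.append(h)
    return np.stack(states)

xs = rng.normal(size=(seq_len, input_size))
hs = rnn_forward(xs)
print(hs.shape)  # (5, 8): one hidden state per time step
```

The key point is that the same parameters are reused at every time step, and each hidden state depends on the entire input prefix, which is what lets RNNs model variable-length sequences such as speech and text.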
This project will import ten years of historical data on news events, public-sentiment metrics, and the price movements of S&P 500-related equities for study and analysis using the latest data-mining and machine-learning techniques. The goal is to uncover correlation and causality between events and the price movements of global markets over multiple timeframes (three-hour, daily, weekly, monthly, and yearly). Specifically, the research will answer which features (metrics) generated from the raw news and sentiment data have predictive power and which do not.
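A first, very simple test of whether a sentiment feature has any predictive power is its linear correlation with returns. The sketch below computes a Pearson correlation on synthetic data; the sample size, noise level, and the planted dependence are assumptions made purely for illustration, and real work would use the imported historical datasets and proper significance testing.

```python
import numpy as np

rng = np.random.default_rng(42)
n_days = 250  # roughly one trading year (assumption)

# Synthetic daily sentiment scores and returns, with a weak planted
# linear dependence of returns on sentiment (coefficient 0.3).
sentiment = rng.normal(size=n_days)
returns = 0.3 * sentiment + rng.normal(scale=1.0, size=n_days)

# Pearson correlation between the feature and the returns series.
corr = np.corrcoef(sentiment, returns)[0, 1]
print(corr > 0)  # True: the planted dependence is recovered
```

Correlation alone cannot establish causality or out-of-sample predictive power; the project's feature-screening would also need lagged features, multiple-testing corrections, and walk-forward evaluation.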
Software failure may result in substantial damage, including loss of human life and financial loss. High-quality software is recognized as a product that has been specified correctly and that meets its specifications. It is therefore important that quality characteristics be specified, measured, and evaluated. The primary objective of this internship is to create a software quality deviation artifact by comparing the quality users expect against the quality finally observed in a software product. For this purpose, the work focuses on the quality measurement process.
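One plausible form the deviation artifact could take is a per-characteristic difference between expected and observed quality scores. The sketch below assumes quality is scored on a 0-10 scale per ISO/IEC 25010-style characteristic; the characteristic names, scale, and scores are illustrative assumptions, not the internship's actual measurement model.

```python
# Expected quality (from user requirements) vs. observed quality
# (from measurements of the finished product), both hypothetical.
expected = {"reliability": 9.0, "usability": 8.0, "performance": 7.5}
observed = {"reliability": 7.5, "usability": 8.5, "performance": 5.5}

# The deviation artifact: observed minus expected per characteristic.
# Negative values flag characteristics that fell short of expectations.
deviation = {k: observed[k] - expected[k] for k in expected}
print(deviation)

# Identify the characteristic with the largest shortfall.
worst = min(deviation, key=deviation.get)
print(worst)  # "performance", with a deviation of -2.0
```

Structuring the artifact this way makes it easy to extend with weights per characteristic or with sub-characteristics, should the measurement process require them.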