Workflow management frameworks support the creation of task dependencies and make efficient use of resources while running those workloads. Typically, these tasks are long-running processes, such as machine learning algorithms, or tasks that access data from databases. Workflow management consists of mapping tasks to suitable resources and managing workflow execution in a cloud environment. The goal of this project is to optimize the job scheduling algorithm of a workflow orchestration framework that manages workloads across a heterogeneous system, using machine learning techniques.
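To make the scheduling problem concrete, the following is a minimal illustrative sketch (not the project's actual algorithm) of mapping a task dependency graph onto heterogeneous workers: tasks become ready once their prerequisites finish, and each ready task is greedily assigned to the worker that would finish it earliest. All names, durations, and speed factors here are hypothetical.

```python
# Illustrative sketch only: greedy list scheduling of a task DAG onto
# heterogeneous workers, not the ML-based scheduler the project targets.
from collections import deque

def schedule(tasks, deps, durations, worker_speeds):
    """tasks: list of task names; deps: {task: set of prerequisite tasks};
    durations: base cost per task; worker_speeds: {worker: speed factor}."""
    indegree = {t: len(deps.get(t, set())) for t in tasks}
    children = {t: [] for t in tasks}
    for t, prereqs in deps.items():
        for p in prereqs:
            children[p].append(t)
    ready = deque(t for t in tasks if indegree[t] == 0)
    free_at = {w: 0.0 for w in worker_speeds}  # when each worker is next idle
    done_at = {}                               # finish time of each task
    plan = {}                                  # task -> assigned worker
    while ready:
        t = ready.popleft()
        # a task cannot start before all of its prerequisites have finished
        start_floor = max((done_at[p] for p in deps.get(t, set())), default=0.0)
        # pick the worker that yields the earliest finish time for this task
        w = min(worker_speeds,
                key=lambda w: max(free_at[w], start_floor)
                + durations[t] / worker_speeds[w])
        start = max(free_at[w], start_floor)
        finish = start + durations[t] / worker_speeds[w]
        free_at[w], done_at[t], plan[t] = finish, finish, w
        for c in children[t]:
            indegree[c] -= 1
            if indegree[c] == 0:
                ready.append(c)
    return plan, done_at
```

A learned scheduler would replace the greedy worker choice with a policy trained on observed runtimes; the surrounding dependency bookkeeping stays the same.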
Search is an important way people get the information they want. Whether we want to find more content about a specific topic, or get general information on a subject, search engines lie at the core of this process. At Flipp, search plays a crucial role in the overall user experience and drives relevant content to consumers. Consequently, improving search by assisting consumers in finding a larger volume of relevant products will be of growing importance to Flipp. The proposed project aims to improve Flipp's search experience by achieving greater relevance, volume, and ease of use.
Machine learning is a discipline of teaching computers repeatable tasks that humans do well but slowly. At Interdata, we are on a mission to use artificial intelligence to understand the data stored by organizations and the relationships between those data assets. As such, Darrell will be working on methodologies and tools to expand our understanding of the algorithms we develop in order to improve them. He will then use those methodologies and tools to engineer new algorithms for the organization to categorize and transform data.
As a result of recent advances in high-throughput technologies, rapidly increasing amounts of mass spectrometry (MS) data pose new opportunities as well as challenges to existing analysis methods. Novel computational approaches are needed to take advantage of the latest breakthroughs in high-performance computing for the large-scale analysis of big data from MS-based proteomics. In this project, we aim to develop new applications of deep learning and neural networks for the analysis of MS data.
Recent advances in computer vision and sensing technology have shown great potential for autonomous vehicles. This work studies the use of an augmented reality heads-up display to improve the reliability of Advanced Driver Assistance Systems (ADAS). The algorithms developed will help drivers detect obstacles (e.g., pedestrians crossing the road), improve current lane departure warnings so that a car can navigate safely, and estimate the distance to nearby vehicles for collision avoidance.
This is a research and development (R&D) project. The primary objective is to develop a Smart Learning and Course Management System with a pilot demonstration of an entrepreneurship development course. The target audience for the course is students interested in learning the fundamentals of running a startup company. The secondary objective is to document the system's development and learning analytics, and to inform them through design-based research.
Unlike the services provided by current mobile networks, which focus only on voice and data, the services provided by 5G networks range from high-data-rate services (e.g., virtual reality) to ultra-reliable low-latency communication (e.g., vehicle communication). Therefore, 5G mobile networks are expected to be designed and operated in a more flexible and effective way. Network virtualization and slicing have been proposed to address these challenges by enabling a new way to design, deploy, and manage networking services. Can network virtualization and slicing be implemented in 5G networks directly?
Laser cladding is an additive manufacturing technology for applying high-quality metal coatings to parts in order to improve their mechanical wear properties and thus increase their lifespan. Currently, these metal coatings are created by depositing metal powders on the workpiece and welding them together with the laser. A significant amount of powder is lost in this process, which is a major factor in the cost of this type of cladding.
Large software systems are updated incrementally to add new features or fix bugs. It is common practice in the software industry to have each incremental change reviewed by a peer to detect software quality issues and to transfer knowledge among team members. While peer review offers both technical and non-technical benefits, it is still primarily based on low-level textual differencing, which places the prior and updated versions of the software source code next to one another.
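The low-level textual differencing that review tools are built on can be illustrated with Python's standard `difflib` module, which compares the prior and updated versions line by line; the snippet below is a toy example with made-up code under review.

```python
# Line-based textual differencing, the foundation of typical review tooling.
import difflib

# Hypothetical prior and updated versions of a function under review.
before = ["def area(r):", "    return 3.14 * r * r", ""]
after = ["import math", "", "def area(r):", "    return math.pi * r ** 2", ""]

diff = list(difflib.unified_diff(before, after,
                                 fromfile="prior", tofile="updated",
                                 lineterm=""))
print("\n".join(diff))
```

The output juxtaposes removed (`-`) and added (`+`) lines, which is exactly the low-level view that reviewers must mentally translate back into design intent.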
The proposed research project targets anomaly detection in event data. The project has a duration of six months and aims to achieve two objectives: (1) to evaluate the effectiveness of a novel approach on real-world data, and (2) to compare it to alternative methods. The intern will use existing research resources and apply them to real-world data provided by the partner, Acerta Analytics Solutions, Inc., to evaluate the different methods.