Advanced Biometric Techniques
This project is investigating various biometric sources (face, iris, hand, fingerprint, and gait) and sensors (2D, 3D, infrared, and others) with the goal of developing more accurate biometric techniques. Our research group supports the government programs on the Gait Challenge Problem, the Face Recognition Grand Challenge, and the Iris Challenge Evaluation. Faculty: Bowyer, Flynn
Algorithms and Software for Problems in Radiosurgery, Radiation Therapy, and Other Medical Applications
This project is on the design, analysis, implementation, and experimentation of new algorithms and software for solving geometric optimization problems arising in radiosurgery, radiation therapy, and other related medical applications. A key step in radiotherapy and radiosurgery is to develop a treatment plan that defines the best radiation beam arrangements and time settings to destroy the target tumor without harming the surrounding healthy tissues. At the core of the planning process is a set of substantially challenging geometric optimization problems. We have been investigating a number of such geometric optimization problems, such as beam selection, beam shaping, surgical navigation and routing, sphere packing, shape approximation, leaf sequencing, field covering and partitioning, image segmentation, and beam source path planning. This is a joint project with the Department of Radiation Oncology, University of Maryland School of Medicine. Our goal is to incorporate our new algorithms and software into clinical radiation treatment planning systems for treating cancer patients. Faculty: Chen
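One of the geometric optimization problems named above, leaf sequencing, can be illustrated with a toy sketch. The sweep-style decomposition below is a textbook approach, not necessarily the project's algorithm, and the function names are mine: a 1D intensity profile is decomposed into unit-weight leaf openings whose indicators sum back to the profile.

```python
def leaf_sequence(profile):
    """Decompose a 1D intensity profile (non-negative ints) into
    unit-weight leaf openings [left, right) whose indicators sum
    back to the profile (a classic sweep-style decomposition)."""
    segments, opens, prev = [], [], 0
    for i, v in enumerate(list(profile) + [0]):
        if v > prev:                      # intensity rises: open new leaves
            opens.extend([i] * (v - prev))
        elif v < prev:                    # intensity falls: close leaves
            for _ in range(prev - v):
                segments.append((opens.pop(), i))
        prev = v
    return segments

def reconstruct(segments, n):
    """Sum the indicator functions of the openings back into a profile."""
    out = [0] * n
    for left, right in segments:
        for i in range(left, right):
            out[i] += 1
    return out
```

The number of openings produced equals the sum of the profile's positive increments, which is optimal for this unidirectional sweep.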
In September 2008, the National Human Genome Research Institute (NHGRI) and the National Institute of Allergy and Infectious Diseases (NIAID) of the U.S. National Institutes of Health approved funding for the sequencing of the genomes and transcriptomes of thirteen Anopheles species, as described in the white paper. Since the initial white paper was approved, two additional species have been added to the project. This project, coordinated by ND biologist Nora Besansky, is motivated by ambitious goals: improved understanding of vectorial capacity, and the application of that understanding toward reducing the malaria disease burden. The Notre Dame Bioinformatics Lab, in collaboration with Prof. Besansky, will help uncover the evolutionary genomics of the An. gambiae species group and the larger set of species in relation to malaria. Faculty: Emrich
Automatic Emotion Detection
The goal of this project is to develop computer systems that automatically sense when a user is bored, confused, frustrated, etc., by monitoring facial features, speech contours, body movements, interaction patterns, and physiological responses. The emotion detection systems will be integrated into computer interfaces in an attempt to provide more effective, user-friendly, and naturalistic interactions. Several projects focusing on either adapting existing emotion detection systems or investigating new modalities for emotion detection are available. Faculty: D'Mello
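A common way to combine the modalities listed above is decision-level ("late") fusion, in which each modality's classifier produces a posterior over affective states and the results are merged. The sketch below is a generic illustration under my own naming, not the project's system:

```python
from collections import defaultdict

def fuse_modalities(modality_probs, weights=None):
    """Decision-level fusion: combine per-modality posteriors over
    affective states with a weighted sum and return the argmax state."""
    weights = weights or {}
    scores = defaultdict(float)
    for modality, probs in modality_probs.items():
        w = weights.get(modality, 1.0)       # per-modality reliability weight
        for state, p in probs.items():
            scores[state] += w * p
    return max(scores, key=scores.get)
```

Weights would typically be learned from how reliable each modality proves to be for each user or context.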
Compucell: Computational Methods for Simulation of Biological Development
We are creating a model that includes how genetics at the subcellular level interacts with biophysics at the cellular level to orchestrate the development of organisms. This model is implemented in a software package called CompuCell. Users define the model to simulate using BioLogo, a domain-specific language that generates a simulation package for the desired model. We also work with biologists, physicists, and mathematicians in the development and validation of simulations of chicken limb development as part of a National Science Foundation Biocomplexity project. Faculty: Izaguirre
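CompuCell's cellular-level biophysics is, to my understanding, based on the Glazier-Graner-Hogeweg (cellular Potts) model; the minimal sketch below (my own simplification, with hypothetical names and only a boundary-energy term) shows the model's core step: a Metropolis attempt to copy a neighboring cell's identity into a lattice site.

```python
import math
import random

def boundary_energy(grid, J):
    """Contact energy: J per pair of 4-neighbors in different cells."""
    H, rows, cols = 0.0, len(grid), len(grid[0])
    for i in range(rows):
        for j in range(cols):
            if i + 1 < rows and grid[i][j] != grid[i + 1][j]:
                H += J
            if j + 1 < cols and grid[i][j] != grid[i][j + 1]:
                H += J
    return H

def attempt_copy(grid, i, j, ni, nj, J, T, rng):
    """One Metropolis step: try copying neighbor (ni, nj)'s cell id into
    site (i, j); accept if energy drops, else with Boltzmann probability."""
    if grid[i][j] == grid[ni][nj]:
        return False
    old = grid[i][j]
    h0 = boundary_energy(grid, J)
    grid[i][j] = grid[ni][nj]
    dH = boundary_energy(grid, J) - h0
    if dH <= 0 or rng.random() < math.exp(-dH / T):
        return True
    grid[i][j] = old                      # reject: restore the old id
    return False
```

A real simulation adds volume and surface constraints and type-dependent contact energies; repeated copy attempts then drive cell sorting and tissue rearrangement.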
Real-time embedded systems can be found in many applications such as communication devices, transportation machines, entertainment appliances, and medical instruments. This research targets two important problems in real-time system design: performance analysis and scheduling algorithm design. Our current focus is on dealing with the uncertainty and flexibility present in many real-time control applications. Faculty: Poellabauer
Research in biometrics depends upon the effective management and processing of many terabytes of digital data. Because these workloads are so data intensive, they are very challenging to scale up to large clusters and grids. To address this, we are designing a data repository and web-enabled tools that simplify browsing and processing large data sets. Our work has produced some of the largest data analysis results to date in the field, reducing the execution time of some problems from years into days. Faculty: Flynn, Thain
Debugging large computing grids is notoriously hard. What can an end user do when a workload of millions of jobs experiences thousands of failures? We propose that data mining techniques are an effective way of explaining what happens to large workloads in grids. We are building and deploying tools that explain failures in computing grids of thousands of processors. Faculty: Chawla, Thain
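A simple form of the idea, sketched below with hypothetical field names: treat each job as a set of (attribute, value) pairs plus a failure flag, and rank pairs by observed failure rate to surface the property that best separates failed jobs from successful ones. Real tools would use stronger statistics (lift, decision trees), but the shape is the same.

```python
from collections import defaultdict

def rank_failure_features(jobs):
    """Rank (attribute, value) pairs by observed failure rate, a first
    step toward explaining which job property correlates with failure."""
    stats = defaultdict(lambda: [0, 0])    # (attr, value) -> [failures, total]
    for job in jobs:
        for attr, value in job.items():
            if attr == "failed":
                continue
            s = stats[(attr, value)]
            s[0] += int(job["failed"])
            s[1] += 1
    return sorted(((f / n, attr_val) for attr_val, (f, n) in stats.items()),
                  reverse=True)
```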
The problem: most projections of CMOS technology foresee an ultimate limit of about 0.05-micron feature sizes in roughly ten years. The QCA solution: use a new technology, Quantum-dot Cellular Automata (QCA), to build real computers orders of magnitude denser than the limits of CMOS from molecular-scale devices in which information is moved by Coulombic interactions rather than current flow. Faculty: Kogge
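QCA's native logic primitive is the three-input majority gate; AND and OR fall out by fixing one input. The behavioral sketch below shows the logic only, not the underlying cell physics:

```python
def majority(a, b, c):
    """QCA's fundamental logic primitive: three-input majority vote."""
    return int(a + b + c >= 2)

def qca_and(a, b):
    return majority(a, b, 0)   # fix one input to 0 to obtain AND

def qca_or(a, b):
    return majority(a, b, 1)   # fix one input to 1 to obtain OR
```

Together with the QCA inverter, the majority gate is functionally complete, which is why QCA circuit synthesis centers on majority logic.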
This project is developing an integrated Wireless Phone Based Emergency Response System (WIPER) that is capable of real-time monitoring of normal social and geographical communication and activity patterns of millions of wireless phone users, recognizing unusual human agglomerations, potential emergencies and traffic jams. WIPER will select from these massive data streams high-resolution information in the physical vicinity of a communication or traffic anomaly, and dynamically inject it into an agent-based simulation system to classify and predict the unfolding of the emergency in real time. The agent-based simulation system will dynamically steer local data collection in the vicinity of the anomaly. Multiple distributed data collection, monitoring, analysis, simulation and decision support modules will be integrated using a Service Oriented Architecture (SOA) to generate traffic forecasts and emergency alerts for engineering, public safety and emergency response personnel. Faculty: Madey
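One simple way such a system might flag "unusual human agglomerations" in a stream of per-cell call volumes is a sliding-window z-score test; the window size and threshold below are illustrative, not WIPER's actual detectors.

```python
from statistics import mean, stdev

def detect_anomalies(series, window=10, threshold=3.0):
    """Flag time steps whose value deviates from the trailing-window
    mean by more than `threshold` standard deviations (z-score test)."""
    alerts = []
    for t in range(window, len(series)):
        history = series[t - window:t]
        m, s = mean(history), stdev(history)
        if s > 0 and abs(series[t] - m) / s > threshold:
            alerts.append(t)
    return alerts
```

An alert at time t would then trigger focused data collection and simulation for the affected cells.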
This project brings together an interdisciplinary team of environmental (biology, chemistry, geology) and IT scientists to develop a stochastic model for the time-dependent evolution of natural organic matter (NOM) in the environment. The scientific objectives are to produce both a new methodology and a specific program for predicting the properties of NOM over time as it evolves from precursor molecules to eventual mineralization. The methodology being developed is a mechanistic, stochastic simulation of NOM transformations, including biological and non-biological reactions, as well as adsorption, aggregation, and physical transport. It employs recent advances in agent-based simulation, web-based deployment of scientific applications, a collaboratory for sharing simulations and data, and scalable web-based database management systems to improve the reliability of the stochastic simulations and to facilitate analysis of the resulting large datasets using data mining techniques. Faculty: Madey
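The core of a mechanistic, stochastic simulation of molecular transformations can be sketched in a few lines. The molecule kinds, rules, and probabilities below are purely illustrative stand-ins, not the project's chemistry: each time step, every molecule of a given kind transforms into its product with a fixed probability.

```python
import random
from collections import defaultdict

def simulate_transformations(counts, rules, steps, seed=0):
    """Per-timestep stochastic simulation. counts: kind -> molecule count;
    rules: kind -> (product_kind, probability of transforming per step)."""
    rng = random.Random(seed)
    for _ in range(steps):
        delta = defaultdict(int)
        for kind, n in counts.items():
            if kind not in rules:
                continue
            product, p = rules[kind]
            converted = sum(rng.random() < p for _ in range(n))
            delta[kind] -= converted
            delta[product] += converted
        for kind, d in delta.items():      # apply all changes at once
            counts[kind] = counts.get(kind, 0) + d
    return counts
```

The real simulator adds second-order reactions, adsorption, aggregation, and transport, but the per-step stochastic update is the same pattern.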
A collaborative research project aimed at developing scheduling algorithms to minimize the energy/power consumption of real-time embedded systems. Some techniques being considered include Dynamic Voltage Scaling (DVS), Dynamic Frequency Scaling (DFS), Sleep Mode Control (SMC), etc. Faculty: Chen, Hu
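The textbook starting point for DVS under EDF scheduling illustrates the energy lever involved: a periodic task set with total utilization U ≤ 1 remains schedulable when the processor runs at U times its maximum frequency, and since dynamic power grows roughly with V²f, slowing down saves energy. This static scheme is only a baseline, not the project's algorithms:

```python
def static_dvs_frequency(tasks, f_max=1.0):
    """Static DVS for EDF: tasks is a list of (wcet_at_fmax, period).
    Returns the slowest frequency that keeps the set schedulable."""
    u = sum(wcet / period for wcet, period in tasks)
    if u > 1.0:
        raise ValueError("task set not schedulable even at f_max")
    return u * f_max                      # scale clock to total utilization
```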
From Computational Discovery to Privacy Preservation in Social, Product, and Health Networks
The fast emergence of interaction networks brings a rich source of information not only about the nodes in such networks, but also about the relationships between them. This notion of collective intelligence opens up a tremendous new source of data that can vastly improve decision making in many domains, be it advertising, the effectiveness of offering related products, or the ability to diagnose medical conditions. Such networks, however, carry enough information about the entities participating in them to compromise their privacy: a node can expose not only its own information, but information about its neighborhood as well. The goal of this project is to extract the rich information available in product, social, and medical networks while preserving the privacy of their users. We allow users to control what information about them can be released and secure the data through cryptographic means. Faculty: Blanton, Chawla
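The neighborhood-exposure problem has a simple consequence for user-controlled release, sketched below under my own naming: because an edge reveals information about both of its endpoints, a consent-respecting view of the graph can include an edge only when both endpoints have agreed to share.

```python
def release_view(graph, consents):
    """Release only the part of a social graph its users agreed to share.
    graph: node -> set of neighbors; consents: node -> bool.
    A node appears iff it consents; an edge iff BOTH endpoints consent."""
    return {n: {m for m in nbrs if consents.get(m, False)}
            for n, nbrs in graph.items() if consents.get(n, False)}
```

Real privacy preservation goes further (perturbation, anonymization, cryptographic protocols), since even a consent-filtered view can leak information about non-consenting neighbors through graph structure.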
Intelligent Edge Devices
In this research, we design and implement prototypes of intelligent network edge devices such as routers, modems, or wireless base stations. The goal is to build sophisticated network and system management stations which exploit their location at the edge of a network and their ability to communicate directly with their end devices to provide efficient and centralized resource and network management functionalities. Faculty: Chawla, Poellabauer, Striegel
Judicious Resource Management
This project studies the complex relationships between resources and the effect resource adaptation has on application performance. The work is driven by the insight that careless (non-cooperative) adaptation of multiple resources or multiple communicating devices can lead to sub-optimal savings in resource utilization or degraded application performance, effects that are often difficult to capture with theoretical models alone and therefore require extensive experimental studies. Faculty: Poellabauer
Many problems in science and engineering can only be solved by harnessing large collections of computers called clusters, clouds, or grids. Unfortunately, these systems are very challenging to use, particularly for data intensive applications. To address this, our lab is designing new languages and systems that allow end users to easily specify and execute workloads that run on hundreds of processors. Faculty: Thain
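Workload languages of this kind typically describe a workload as a dependency graph of tasks, in the spirit of make. The toy executor below (my own sketch, not the lab's systems) shows the essential semantics: build each target after its dependencies, assuming an acyclic graph; a real system would dispatch independent targets to remote processors in parallel.

```python
def run_workflow(rules):
    """Tiny make-style executor. rules: target -> (dependencies, action).
    Builds each target after its dependencies (graph assumed acyclic);
    returns the order in which targets were built."""
    done, order = set(), []

    def build(target):
        if target in done:
            return
        deps, action = rules.get(target, ((), None))
        for d in deps:
            build(d)                      # depth-first: dependencies first
        if action:
            action()
        done.add(target)
        order.append(target)

    for target in rules:
        build(target)
    return order
```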
Adding an Energy Gear to High Performance Embedded Systems. Faculty: Brockman, Kogge
As CPU speeds and RAM capacities continue to grow rapidly, the bottleneck between the processor and main memory becomes increasingly costly. PIM is an attempt to solve this problem by combining processor and memory macros on a single chip. The benefits of such an architectural shift include very high bandwidth and multi-processor scaling capabilities. Our group is actively exploring these possibilities and more. Faculty: Brockman, Kogge
We are developing multiscale methods for simulation of proteins and other biological molecules. These methods are useful for understanding important post-human-genome biological questions, such as the folding pathways of proteins or the relationship between structure and function. Our goal is to provide algorithms that scale with system size and simulation length. We also provide efficient and extendible implementations of these algorithms using a high performance object-oriented framework called ProtoMol. This project is funded by the National Science Foundation. Faculty: Izaguirre
The objective of this project is to overcome the limitations of mobile wireless devices by allowing them to spontaneously request access to resources (such as storage, CPUs, network bandwidths) and information (e.g., obtained from sensors) residing on wireless peers in their proximity. The project addresses the resource-efficient and reliable discovery and access of such resources and information. Faculty: Poellabauer
We are creating a series of cloud-enabled bioinformatics tools through Biocompute, which is a web-based tool that leverages grid-computing resources for solving large bioinformatics problems. Biocompute divides large bioinformatics jobs into hundreds of smaller jobs that are sent to either a large collection of personal computers or an external cloud. This drastically reduces the time required to complete desired tasks. Further, each Biocompute user has a personal workspace where he or she can store and share files and results with others. Users can create custom databases to run jobs on or they can use public databases. Biocompute is maintained by the Cooperative Computing Lab and is supported by the Bioinformatics Core Facility at the University of Notre Dame. Faculty: Emrich, Thain
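The job-splitting step can be illustrated with a small sketch (my own, with hypothetical names; Biocompute's actual splitting logic may differ): divide a multi-record FASTA query file into roughly equal chunks so each can run as an independent job on a separate machine.

```python
def split_fasta(fasta_text, n_chunks):
    """Split a multi-record FASTA string into n_chunks pieces
    (round-robin by record) for independent parallel jobs."""
    records = [">" + r for r in fasta_text.split(">") if r.strip()]
    chunks = [[] for _ in range(n_chunks)]
    for i, record in enumerate(records):
        chunks[i % n_chunks].append(record)
    return ["".join(chunk) for chunk in chunks]
```

With the query set split this way, results from the individual jobs are simply concatenated (and re-sorted if needed) to reconstruct the full output.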
Secure and Reliable Computation Outsourcing
This project allows resource-constrained devices (limited in their battery life or computational capabilities) to use external computing resources such as supercomputers or grids to carry out their extensive computational tasks in a secure and reliable way. Secure computation means that powerful helper machines do not learn any information about the data being processed yet help a weak device to compute its task; and reliable computation means that the device is able to verify that the computation was indeed performed correctly without the need to recompute the task itself. We develop cryptographic techniques to securely outsource different types of computation such as biometric and DNA comparisons and others. We also use the fact that helper servers do not obtain access to the data they are processing to allow the device to detect deviations from the prescribed computation more effectively. Faculty: Blanton
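A classic example of verifying a computation without recomputing it is Freivalds' check for outsourced matrix multiplication: each probabilistic trial costs O(n²) instead of the O(n³) recomputation, and a wrong answer escapes detection with probability at most 2^-trials. This is a standard technique offered as an illustration, not the project's specific protocols:

```python
import random

def freivalds_verify(A, B, C, trials=10, seed=0):
    """Check whether C = A*B using random 0/1 probe vectors:
    A(Br) must equal Cr for every probe r if C is correct."""
    rng = random.Random(seed)
    n = len(A)

    def matvec(M, v):
        return [sum(M[i][j] * v[j] for j in range(n)) for i in range(n)]

    for _ in range(trials):
        r = [1 if rng.random() < 0.5 else 0 for _ in range(n)]
        if matvec(A, matvec(B, r)) != matvec(C, r):
            return False                  # caught a deviation
    return True
```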
Sensor Networks
The objective of this project is to develop techniques for the flexible and efficient collaboration among sensors in a wireless sensor network, including techniques for energy-efficient real-time routing of sensor data or resource-efficient in-network data aggregation and fusion. Faculty: Chawla, Poellabauer
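In-network aggregation can be sketched with a toy routing tree (my own naming): each node combines its own reading with the partial (sum, count) results of its children, so only one small tuple, rather than every raw reading, travels up each link.

```python
def aggregate(tree, readings, node):
    """Aggregate (sum, count) up a routing tree rooted at `node`.
    tree: node -> list of children; readings: node -> sensed value."""
    total, count = readings[node], 1
    for child in tree.get(node, ()):
        s, c = aggregate(tree, readings, child)  # child's partial result
        total += s
        count += c
    return total, count
```

The sink can then compute the network-wide average as total/count, having received a constant amount of data per link regardless of network size.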
We develop tools that simplify the design and implementation of high-performance software, using object-oriented and generic programming in languages such as C++ and Eiffel. Examples include tools that automatically generate tests from semi-formal specifications embedded in the code itself, following Eiffel's design-by-contract approach. We are also building self-adaptive programs that can choose the best algorithms and parameters for particular problems at run time. Faculty: Izaguirre
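The design-by-contract idea behind contract-driven test generation can be sketched outside Eiffel; below is a Python rendering (my own names) of Eiffel's require/ensure clauses: the contract is checked at every call, so random inputs plus the contract itself form a test oracle.

```python
import random

def contract(pre, post):
    """Design-by-contract decorator: check the precondition on entry
    and the postcondition on exit (Eiffel-style require/ensure)."""
    def deco(func):
        def wrapper(*args):
            assert pre(*args), "precondition violated"
            result = func(*args)
            assert post(result, *args), "postcondition violated"
            return result
        return wrapper
    return deco

@contract(pre=lambda x: True,
          post=lambda r, x: r >= 0 and r in (x, -x))
def absolute(x):
    return x if x >= 0 else -x

def contract_tests(func, generator, n=200, seed=0):
    """Run func on generated inputs; contract assertions are the oracle."""
    rng = random.Random(seed)
    for _ in range(n):
        func(generator(rng))
```

A buggy implementation is caught the first time a generated input violates its postcondition, with no hand-written expected outputs.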
The SPANIDS project is developing a scalable architecture for real-time intrusion detection on high-speed networks. We are developing techniques that exploit concurrency in network traffic to distribute high-bandwidth network streams across several sensors, significantly improving the capabilities of existing network intrusion detection systems (NIDS). We implemented a prototype system built from inexpensive hardware and open source software that demonstrates the scalability and flexibility of our approach. Faculty: Freeland
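A standard way to exploit concurrency in traffic while keeping stateful detection sound is to hash each flow's 5-tuple, so every packet of a flow reaches the same sensor. The sketch below is a generic illustration of that idea, not SPANIDS's actual (adaptive, load-balancing) scheme:

```python
import hashlib

def assign_sensor(src, dst, sport, dport, proto, n_sensors):
    """Map a flow 5-tuple to a sensor index so that all packets of the
    same flow land on the same sensor, preserving per-flow NIDS state."""
    key = f"{src}|{dst}|{sport}|{dport}|{proto}".encode()
    digest = hashlib.sha256(key).digest()
    return int.from_bytes(digest[:8], "big") % n_sensors
```

A static hash like this can overload a sensor under skewed traffic, which is exactly why a scalable splitter must also rebalance flows dynamically.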