Research Experiences for Undergraduates (REU)
Cybersecurity, Robotics, and Software Engineering
Each project will begin with an in-depth study of the scientific concepts that underlie it. Students will also have the opportunity to observe the more theoretical activities of their graduate mentors and to see the results of their work in the broader context of the research objectives. Each student will also be required to work with graduate students on publication activities related to the results of their work.
Adaptive Event Stream Processing for Security in SmartGrid
Software systems for contemporary applications are becoming increasingly complex, demanding active, reactive, and proactive behavior to support context-aware and adaptive execution environments. To support these new application requirements, this research is transforming event stream processing into a dynamic and adaptive process through the integration of event detection with machine learning. This approach involves the development of an adaptive environment for event detection that combines event query processing and logical inference with probability measures, which can be used to detect well-known event patterns, to evolve existing event patterns, and to learn new and meaningful event patterns. At Texas Tech, we are investigating the application of this research to security monitoring in the Smart Grid, a modernization of the current electrical grid that uses information technology to improve the efficiency and sustainability of electricity production and distribution. REU student projects will focus on two areas: refining an existing Zigbee home area network simulation for developing and testing event detection patterns for intrusion detection in the Smart Grid, and performing some of the statistical analysis and data mining needed to support the learning of new event patterns that can increase Smart Grid security.
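To make the idea of an event detection pattern concrete, the following is a minimal sketch, not the project's actual event language or detection architecture: a sliding-window rule that flags a smart-meter node when several failed-authentication events arrive within a short interval. All event names, field names, and thresholds here are assumptions chosen for illustration.

```python
from collections import defaultdict, deque

WINDOW_SECONDS = 60   # assumed sliding-window length
THRESHOLD = 3         # assumed failure count that triggers an alert

class FailedAuthDetector:
    """Flags a node when >= THRESHOLD auth failures occur within the window."""

    def __init__(self, window=WINDOW_SECONDS, threshold=THRESHOLD):
        self.window = window
        self.threshold = threshold
        self.history = defaultdict(deque)  # node id -> failure timestamps

    def process(self, event):
        """Return an alert dict if the pattern fires on this event, else None."""
        if event["type"] != "auth_failure":
            return None
        times = self.history[event["node"]]
        times.append(event["time"])
        # Discard timestamps that have fallen out of the sliding window.
        while times and event["time"] - times[0] > self.window:
            times.popleft()
        if len(times) >= self.threshold:
            return {"alert": "possible_intrusion",
                    "node": event["node"],
                    "count": len(times)}
        return None

detector = FailedAuthDetector()
stream = [
    {"type": "auth_failure", "node": "meter7", "time": 0},
    {"type": "reading",      "node": "meter7", "time": 10},
    {"type": "auth_failure", "node": "meter7", "time": 20},
    {"type": "auth_failure", "node": "meter7", "time": 45},
]
alerts = [a for a in (detector.process(e) for e in stream) if a]
```

A learned or evolved pattern, in this framing, would correspond to adjusting the window, the threshold, or the event combination based on the statistical analysis the student projects perform.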
Software Specifications for Cybersecurity
The Descartes specification language research effort is one part of an overall software engineering research program in software requirements, specification, process, and environments. It is built on a solid foundation of research that has produced completed graduate degrees and involved three REU site project students and two REU supplement students. The research effort will focus on automated software specification generation from expected input and corresponding output, a direct extension of earlier research on automated test data generation from Descartes specifications. Extensions to the language for real-time, object-oriented, and intelligent agent software development will support an effort on executable software specifications as formal methods for information assurance, investigated in the context of the MRI and SFS grants mentioned above for Susan Urban. Dr. J. Urban is the P.I. of the NSF SFS project.
Autonomy in Human-Robot Collaboration
Mobile robots equipped with multiple sensors and sophisticated algorithms receive far more raw data than can be processed in real time. In addition, unforeseen changes frequently make it difficult for robots to operate without any human supervision. At the same time, humans may not have the time and expertise to provide elaborate and accurate feedback in complex real-world domains. Dr. Sridharan's research group is developing an architecture that jointly addresses the learning, knowledge representation, and interaction challenges in human-robot collaboration. The architecture will enable robots to:
- Autonomously and incrementally learn models of domain objects and events.
- Represent, incrementally revise, reason with, and learn from qualitative and quantitative descriptions of knowledge and uncertainty.
- Learn associations between multimodal (e.g., visual and verbal) descriptions of domain objects and events.
- Solicit and use high-level feedback from non-expert human participants when such feedback is necessary and available.
REU students will have the opportunity to participate in the design, implementation and evaluation of these algorithms in simulation, and on different mobile robot platforms. Key application domains include assistive care, surveillance and robot soccer. More information about the robotics research projects can be found online at redwood.cs.ttu.edu/~smohan/Research.html
Computational Modeling for Sustainable Management of Groundwater and Agricultural Production under a Changing Climate
Climate forecasts influence policies and planning in fields such as agriculture, ecological preservation and resource management. In this research project, we consider the Texas High Plains region where, for the past several decades, agriculture has relied on the Ogallala Aquifer system to mitigate the effect of large climate variability and a semi-arid climate. With unsustainable water withdrawals far exceeding annual recharge rates, producers are becoming increasingly dependent on rainfall at a time when climate change is leading to increasing temperatures, rising evaporation rates, and shifting of seasonal precipitation. Since existing policies are unable to assure the sustainability of water resources, agriculture, and the regional economy, we seek to provide comprehensive, workable strategies for sustainable water management and agricultural production in the Texas High Plains in the face of a changing climate. REU students will be involved in the design, implementation and evaluation of computational algorithms for agricultural irrigation management, yield mapping, and downscaling of climate models. For instance, students can develop stochastic machine learning algorithms that use historical data of weather observations and satellite images to accurately estimate the water requirements for irrigation and the crop yield. Students can also contribute to the development of deep architectures that learn the relationships between global models and regional observations, thus making accurate predictions of regional weather parameters.
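As a simple illustration of estimating irrigation water needs from weather data, the sketch below uses the Hargreaves equation, a standard temperature-based estimate of reference evapotranspiration (ET0, in mm/day). This is only a baseline formula for illustration; the project's actual algorithms (stochastic machine learning over weather observations and satellite imagery) are far richer, and the crop coefficient and radiation values used here are assumed example numbers.

```python
import math

def hargreaves_et0(t_min, t_max, ra):
    """Reference evapotranspiration (mm/day) via the Hargreaves equation.

    t_min, t_max: daily min/max air temperature (deg C)
    ra: extraterrestrial radiation expressed in mm/day of evaporation equivalent
    """
    t_mean = (t_min + t_max) / 2.0
    return 0.0023 * ra * (t_mean + 17.8) * math.sqrt(t_max - t_min)

def irrigation_requirement(et0, crop_coefficient, effective_rain_mm):
    """Net irrigation need: crop water demand minus usable rainfall (mm/day)."""
    return max(0.0, et0 * crop_coefficient - effective_rain_mm)

# Assumed values for a hot, dry summer day on the Texas High Plains.
et0 = hargreaves_et0(t_min=18.0, t_max=34.0, ra=16.0)
need = irrigation_requirement(et0, crop_coefficient=1.15, effective_rain_mm=0.0)
```

A machine learning approach would, in effect, learn a far more flexible mapping from many such inputs (plus satellite-derived features) to observed water use and yield, rather than relying on a fixed closed-form formula.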
DOROTHY: Design of Robot Oriented Thinking to Help Youth
Three-dimensional graphical programming environments such as Alice have been developed as an effective way to stimulate interest in computing by teaching students how to program. Students have also embraced robotics as a means to learn computing because robots illustrate practical applications of computing. In collaboration with five REU Site project students, we have developed a novel educational tool known as DOROTHY that integrates Alice with autonomous robots, with bidirectional communication between the graphical interface and the robots. Students without any prior programming experience can create graphical routines in virtual worlds using syntax that is easy to learn. The tool automatically converts these routines into programs for synchronous and asynchronous execution (and adaptive behavior) on multiple robot platforms in the real world. Furthermore, we have developed a curriculum that uses this tool to teach core concepts of computing, concurrent execution, and real-world sensing to school students. Undergraduate students involved in this project will enhance the tool and the curriculum, and also participate in teaching the material to local school students and teachers.
Data Deduplication in High-Performance Computing
Data deduplication is widely recognized as a critical technique for reducing the volume of data to be stored on storage systems, and it is primarily used for backup storage in existing high-performance computing (HPC), cloud computing, and big data computing systems. The data explosion of scientific applications, however, poses a significant challenge for the I/O (input/output) subsystems and primary storage of existing HPC systems: the movement of huge volumes of data has become the bottleneck of computing. The goal of this project is to investigate innovative data deduplication methods that reduce the data movement of I/O operations with limited overhead. We expect to design an I/O deduplication framework based on existing work to support read/write operations and to investigate its overhead. The primary summer work is to study and run open-source single-node deduplication file systems, including lessfs and Opendedup SDFS, analyze their designs, and evaluate their performance.
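The core mechanism behind such systems can be sketched as follows. This is a minimal illustration of fixed-size chunking with content fingerprints, not the actual design of lessfs or SDFS (which use more sophisticated chunking, indexing, and persistence): each file is split into chunks, each chunk is fingerprinted with a cryptographic hash, and a chunk is stored only the first time its fingerprint is seen.

```python
import hashlib

CHUNK_SIZE = 4096  # assumed fixed chunk size for this sketch

class DedupStore:
    """Toy in-memory deduplicating store: unique chunks are kept only once."""

    def __init__(self, chunk_size=CHUNK_SIZE):
        self.chunk_size = chunk_size
        self.chunks = {}   # fingerprint -> chunk bytes (stored once)
        self.files = {}    # file name -> ordered list of fingerprints

    def write(self, name, data):
        recipe = []
        for i in range(0, len(data), self.chunk_size):
            chunk = data[i:i + self.chunk_size]
            fp = hashlib.sha256(chunk).hexdigest()
            # Store the chunk only if this content has not been seen before.
            if fp not in self.chunks:
                self.chunks[fp] = chunk
            recipe.append(fp)
        self.files[name] = recipe

    def read(self, name):
        """Reassemble a file from its chunk recipe."""
        return b"".join(self.chunks[fp] for fp in self.files[name])

store = DedupStore()
payload = b"A" * 8192          # two identical 4 KiB chunks
store.write("a.dat", payload)
store.write("b.dat", payload)  # a full duplicate of a.dat
# 16 KiB of logical data, but only one unique chunk is physically stored.
```

The research challenge the project targets is doing this on the I/O path of an HPC system, where fingerprinting and index lookups must not become a new bottleneck themselves.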
Fast Data Analysis for Big Data Applications
Many scientific computing and high-performance computing applications have become increasingly data intensive. Recent studies have begun to use indexing, subsetting, and data reorganization to manage increasingly large datasets. In this project, we intend to build an innovative fast data analysis framework for scientific big data applications. We have recently studied a Fast Analysis with Statistical Metadata (FASM) approach that subsets the data and integrates a small amount of statistics into the original datasets. The added statistical information describes the data shape and distribution, so the original scientific libraries can use these statistical metadata to perform fast queries and analyses. We expect to further study subsetting, indexing, segmented analysis, and pre-analysis for reducing data movement and enabling fast data analysis for scientific big data applications. These concepts and ideas can potentially lead to new data analytics methodologies and improve the productivity of scientific discovery.
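The benefit of embedded statistical metadata can be sketched with a simple example. The code below is an assumption-laden illustration of the general idea, not the FASM implementation: per-block (min, max) summaries are computed once at write time, and a later range query consults them to skip blocks that provably contain no matches, reducing the data that must be read and scanned.

```python
BLOCK = 4  # assumed block size (elements per block) for this sketch

def build_metadata(values, block=BLOCK):
    """Per-block (min, max) statistics, computed once when data is written."""
    return [(min(values[i:i + block]), max(values[i:i + block]))
            for i in range(0, len(values), block)]

def range_query(values, meta, lo, hi, block=BLOCK):
    """Return values in [lo, hi], scanning only blocks the metadata admits."""
    hits, blocks_scanned = [], 0
    for b, (mn, mx) in enumerate(meta):
        if mx < lo or mn > hi:
            continue            # metadata proves this block has no matches
        blocks_scanned += 1
        start = b * block
        hits.extend(v for v in values[start:start + block] if lo <= v <= hi)
    return hits, blocks_scanned

data = [1, 2, 3, 4, 50, 60, 70, 80, 5, 6, 7, 8]
meta = build_metadata(data)
hits, scanned = range_query(data, meta, 55, 75)
# Only the middle block overlaps [55, 75]; the other two blocks are skipped.
```

At scientific-dataset scale, skipped blocks translate directly into avoided I/O, which is the data-movement reduction the project is after.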