Project Overview
Much of the functionality in our daily lives is software controlled, and hence protecting our software against security vulnerabilities is of extreme importance. A common source of vulnerabilities is input to the software that is not checked within the application. These vulnerabilities take different forms and names, such as cross-site scripting, SQL injection and so on. In particular, cross-site scripting allows attackers to pass unchecked input in the form of malicious scripts, which may then be executed in the browser of a non-malicious user. SQL injection refers to unchecked program input being used to construct database queries, which may then be exploited by an attacker to reveal confidential information such as user passwords.
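The SQL injection pattern above can be sketched in a few lines. This is a minimal illustrative example, not code from the project: the table schema, data and function names are assumptions made up for the demonstration.

```python
import sqlite3

# Hypothetical in-memory users table, purely for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, password TEXT)")
conn.execute("INSERT INTO users VALUES ('alice', 'secret')")

def login_unsafe(name, password):
    # Vulnerable: unchecked input is spliced directly into the SQL string.
    query = "SELECT * FROM users WHERE name = '%s' AND password = '%s'" % (name, password)
    return conn.execute(query).fetchall()

def login_safe(name, password):
    # Parameterized query: input is treated as data, never as SQL.
    query = "SELECT * FROM users WHERE name = ? AND password = ?"
    return conn.execute(query, (name, password)).fetchall()

# The classic injection string makes the WHERE clause true for every row,
# bypassing the password check...
rows = login_unsafe("alice", "' OR '1'='1")       # -> [('alice', 'secret')]
# ...while the parameterized version correctly rejects it.
safe_rows = login_safe("alice", "' OR '1'='1")    # -> []
```

The fix is exactly the kind of input check the project aims to point programmers towards: the parameterized version never lets input alter the structure of the query.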
In this project, we propose to develop and employ analysis methods for detecting the impact of program inputs on (parts of) an application. The main purpose is to detect or explain potential software attacks, thereby enhancing software security. One of the innovative outputs of the project will be the use of software analysis and symbolic execution methods to generate and explain potential attack scenarios, without actually encountering the attacks.
Our infrastructure will be geared towards finding and summarizing the input-dependent parts of an application. It may suggest mechanisms to the programmer for making their applications more robust by inserting additional checks at the appropriate places in the program. More interestingly, our analysis infrastructure can potentially reveal attack scenarios prior to the deployment of an application. This will be done by tracking input propagation within the application, finding the input-dependent parts and summarizing them for the programmer.
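The idea of tracking input propagation can be sketched with a toy dynamic taint analysis. This is a hypothetical illustration, not the project's infrastructure: values carry a taint flag that propagates through operations, so any result derived from external input is marked input-dependent.

```python
class Tainted:
    """A value paired with a flag recording whether it depends on external input."""

    def __init__(self, value, tainted=False):
        self.value = value
        self.tainted = tainted

    def __add__(self, other):
        # Any result computed from tainted input is itself tainted.
        other_value = other.value if isinstance(other, Tainted) else other
        other_taint = other.tainted if isinstance(other, Tainted) else False
        return Tainted(self.value + other_value, self.tainted or other_taint)

def summarize(bindings):
    # Report which variables ended up depending on external input.
    return sorted(name for name, v in bindings.items() if v.tainted)

user_input = Tainted(5, tainted=True)   # value arriving from outside the program
constant = Tainted(10)                  # program-internal constant

x = constant + 2            # independent of input
y = user_input + constant   # derived from input, hence tainted

print(summarize({"x": x, "y": y}))  # -> ['y']
```

A real analysis would propagate taint through all operators and control flow, but the principle is the same: the summary tells the programmer which parts of the program an attacker-controlled input can reach.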
As part of our work, we observed that software often comes with a reference implementation. For example, web servers implement the HTTP protocol, so for testing and debugging a specific web server we can use a well-known and well-tested web server such as Apache as the reference implementation. Based on this observation, our project has developed several test generation, debugging and validation methods, as evidenced by the publications listed below.
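The reference-implementation idea can be sketched as a differential test: feed the same inputs to a trusted "golden" implementation and to the implementation under test, and flag any input on which they disagree. The two header parsers below are hypothetical stand-ins (with a deliberately seeded bug), not the project's tools.

```python
def golden_parse(header):
    # Trusted reference: strips whitespace around the header value.
    name, _, value = header.partition(":")
    return name.strip(), value.strip()

def parse_under_test(header):
    # Implementation under test: forgets to strip the value (seeded bug).
    name, _, value = header.partition(":")
    return name.strip(), value

def differential_test(inputs):
    # Return the inputs on which the two implementations disagree;
    # each divergence is a candidate bug (or test case) to investigate.
    return [h for h in inputs if golden_parse(h) != parse_under_test(h)]

inputs = ["Host: example.com", "Host:example.com", "Accept: */*"]
print(differential_test(inputs))  # -> ['Host: example.com', 'Accept: */*']
```

Note that "Host:example.com" passes: the bug only surfaces on inputs with a space after the colon, which is why generating inputs that expose divergences (rather than relying on a fixed test suite) is the interesting problem.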
People
Faculty Members
- Abhik Roychoudhury (Principal Investigator)
- Liang Zhenkai (Co-Principal Investigator)
Post-doctoral Fellow
PhD Student
Research Assistants and Interns
- Hoang Thien (Intern)
- Sun Tao (RA)
Publications
- Test Generation to Expose Changes in Evolving Programs [ PDF ]
Dawei Qi, Abhik Roychoudhury, Zhenkai Liang
25th IEEE/ACM International Conference on Automated Software Engineering (ASE) 2010.
- Golden Implementation Driven Software Debugging [ PDF ]
Ansuman Banerjee, Abhik Roychoudhury, Johannes A. Harlie, Zhenkai Liang
International Symposium on Foundations of Software Engineering (FSE) 2010.
- DARWIN: An Approach for Debugging Evolving Programs [ PDF ]
Dawei Qi, Abhik Roychoudhury, Zhenkai Liang, Kapil Vaswani
Joint European Software Engineering Conference and ACM SIGSOFT Symposium on the Foundations of Software Engineering (ESEC-FSE), Amsterdam, The Netherlands, August 2009. (ACM SIGSOFT Distinguished Paper Award)
- Debugging as a Science, that too, when your Program is Changing [ PDF ]
Abhik Roychoudhury
Keynote at the Intl. Workshop on Harnessing Theories for Tool Support in Software (TTSS) 2009; to appear in Electronic Notes in Theoretical Computer Science (ENTCS) in 2010.
Funding
This project is funded by the Defence Research and Technology Office (DRTech), Singapore, for a period of three years (2009-2012). This support is gratefully acknowledged.