MSc Computer Science - PROM05: Planning & Evaluating Research
January 30, 2026
Planning and Evaluating Research
As I reach the midway point of the planning phase of my dissertation, I have been reflecting on how project management and research evaluation need to be tailored to my project on deepfake detection.
This field sits at the intersection of AI, machine learning, and cybersecurity, where technology evolves rapidly and research must adapt at the same pace to emerging challenges. Conducting research in such a dynamic space requires careful planning, structured evaluation, and flexible methodologies to ensure meaningful outcomes.
To manage the project effectively, I am combining Agile-style task management with milestone planning. Breaking the work into stages, for example dataset preparation, model testing, interface design, and user evaluation, will allow me to monitor progress, anticipate risks, and adapt to unexpected technical or ethical challenges. Tools such as GitHub Projects will support this approach by providing task tracking, version control, and a visual overview of the research workflow. This is particularly important because iterative experimentation is essential in this area, and prototype development will involve multiple revisions.
The research evaluation approach must also reflect the dual focus of my project: technical performance and user experience.
As previously discussed, quantitative measures such as detection accuracy, processing time, false positive rate, and resource usage will provide objective evidence of how well the prototype identifies manipulated media under realistic conditions (Sokolova and Lapalme, 2009). At the same time, qualitative analysis examines the broader context: the threat landscape, ethical considerations, privacy concerns, and user trust. Usability testing, questionnaires, and descriptive feedback will provide insight into whether the system supports informed decision-making and enhances digital safety.
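To make these quantitative measures concrete, the sketch below computes detection accuracy, false positive rate, and recall from a detector's binary predictions. The labels and predictions are illustrative placeholders, not results from my prototype.

```python
# Minimal sketch of the quantitative measures discussed above, computed from
# a hypothetical deepfake detector's binary output (1 = manipulated media).
# All data in this example are placeholders for illustration only.

def confusion_counts(y_true, y_pred):
    """Return (TP, FP, TN, FN) for binary labels where 1 = manipulated."""
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    return tp, fp, tn, fn

def detection_metrics(y_true, y_pred):
    """Summarise accuracy, false positive rate, and recall for a detector."""
    tp, fp, tn, fn = confusion_counts(y_true, y_pred)
    return {
        "accuracy": (tp + tn) / len(y_true),
        "false_positive_rate": fp / (fp + tn) if (fp + tn) else 0.0,
        "recall": tp / (tp + fn) if (tp + fn) else 0.0,
    }

if __name__ == "__main__":
    y_true = [1, 0, 1, 1, 0, 0, 1, 0]  # ground truth (placeholder data)
    y_pred = [1, 0, 1, 0, 0, 1, 1, 0]  # detector output (placeholder data)
    print(detection_metrics(y_true, y_pred))
```

In practice a library such as scikit-learn would provide these metrics directly; the point here is simply that each measure reduces to counts of true and false positives and negatives, which keeps the evaluation transparent and reproducible.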
Research at the forefront of AI and cybersecurity demands rigour, adaptability, and reflection. By integrating robust project management with a mixed-methods evaluation (Creswell and Plano Clark, 2018), I aim to ensure that the prototype is not only technically effective but also socially and ethically relevant. This approach shows how research methods can be tailored to emerging challenges in rapidly developing fields while still producing credible and actionable outcomes.
References
Creswell, J.W. and Plano Clark, V.L. (2018). Designing and Conducting Mixed Methods Research. 3rd edn. Thousand Oaks, CA: Sage.
Sokolova, M. and Lapalme, G. (2009). A systematic analysis of performance measures for classification tasks. Information Processing & Management, 45(4), pp. 427-437. Available at: https://www.sciencedirect.com/science/article/pii/S0306457309000259 (Accessed: 30 January 2026).