Dr. Neil Cooke is a Lecturer in Applied Computing. His school web pages are located at www.eee.bham.ac.uk/cooken
Neil’s research interests are in multimodal interaction technologies, speech recognition, design methods and the prototyping of interactive 3D systems. He has developed and teaches modules in Interactive 3D Design, Project Management, Software Engineering and Artificial Intelligence.
• PGCert in Learning and Teaching in Higher Education, 2011.
• PhD in Engineering, University of Birmingham, 2007.
• MSc with Distinction in Communications, Computers and Human-Centred Systems, University of Birmingham, 2002.
• BEng (Hons) in Electronic Engineering, UMIST, 1993.
Neil Cooke qualified with a BEng (Hons) in Electronic Engineering from the University of Manchester Institute of Science and Technology (UMIST) in 1993.
He went on to work in a number of industrial roles in software development and communication network engineering in the UK and in Germany. In 2002 he obtained an MSc in Communications, Computers and Human-Centred Systems, and followed this with a PhD in which he developed concepts for multimodal speech recognition systems that utilise information from eye gaze.
In 2005 he was appointed as a Lecturer in Applied Computing at the University of Birmingham. In this appointment he has developed his research interests in multimodal interaction technologies and supervised PhD students in this area. Latterly, he has developed prototypes of Interactive 3D (i3D) serious games for defence.
His research activities have informed his teaching in Interactive 3D Design, Artificial Intelligence, Multimodal Interaction, Software Engineering and Project Management.
• Advanced Interactive 3D Design for Virtual Environments and Serious Games
• Interactive 3D Design for Virtual Environments and Serious Games
• Artificial Intelligence
• Multimodal information fusion centred on the use of information from gaze, speech and brain signals.
• Measures of appropriateness for multimodal information fusion.
• The serious game design process.
Cooke and Shen. "Mutual Information as a variable to differentiate the roles of gaze in the multimodal interface". Presented at the Workshop on Eye Gaze in Intelligent Human Machine Interaction at the 2010 International Conference on Intelligent User Interfaces (2010).
Cooke and Russell. "Cache-based language model adaptation using visual attention for ASR in meeting scenarios". Proceedings of the 2009 International Conference on Multimodal Interfaces (ICMI-MLMI '09), 87-90 (2009).
Cooke and Russell. "Gaze-Contingent Automatic Speech Recognition". IET Signal Processing 2(4), 369-380 (2008).
Cooke and Russell. "Gaze-Contingent ASR for Spontaneous, Conversational Speech: An Evaluation". International Conference on Acoustics, Speech and Signal Processing (ICASSP 2008).
Bull, Cooke and Mabbott. "Visual Attention in Open Learner Model Presentations: An Eye-Tracking Investigation". The 11th International Conference on User Modeling (UM 2007).
Cooke. "Gaze-Contingent Automatic Speech Recognition". PhD Thesis, University of Birmingham, 2006.
Cooke and Russell. "Using the focus of visual attention to improve automatic speech recognition". INTERSPEECH 2005, 9th European Conference on Speech Communication and Technology, Lisbon, Portugal (2005).
Cooke, Russell and Meyer. "Evaluation of hidden Markov model robustness in uncovering focus of visual attention from noisy eye-tracker data". Proceedings of ETRA 2004, Eye Tracking Research & Applications Symposium, San Antonio, Texas, 22-24 March 2004, ACM. ISBN 1-58113-825-3. (For more details on this work see the PhD thesis.)