My research interests centre broadly on computer vision and image analysis, applied across a range of domains. Historically, I have worked in agri-tech, security and consumer robotics, amongst others.
Journal Publications
- Bernotas, G., et al. (2019). A photometric stereo-based 3D imaging system using computer vision and deep learning for tracking plant growth. GigaScience, Volume 8, Issue 5, May 2019. (4th Author)
- Smith, L. N., et al. (2018). Innovative 3D and 2D machine vision methods for analysis of plants and crops in the field. Computers in Industry, 97, pp. 122-131. ISSN 0166-3615. (4th Author)
Conference Publications
- Blackwell, A. F., et al. (2018). Computer says ‘don’t know’ – interacting visually with incomplete AI models. In S. Tanimoto, S. Fan, A. Ko and D. Locksa (Eds.), Proceedings of the Workshop on Designing Technologies to Support Human Problem Solving. University of Washington. pp. 5-14. (3rd Author)
- Hales, I. J. (2015). A Low-Cost, Low-Power System for 3D Plant Imaging. Poster at PhenoDays, Freising, Germany.
- Hales, I. J., et al. (2015). Long-range concealed object detection through active covert illumination. Proc. SPIE 9648, Electro-Optical and Infrared Systems: Technology and Applications XII; and Quantum Information Science and Technology, 964806 (13 October 2015).
- Hansen, M. F., Smith, M. L., Smith, L., Hales, I. J. and Forbes, D. (2015). Non-intrusive automated measurement of dairy cow body condition using 3D video. In Proceedings of the Machine Vision of Animals and their Behaviour (MVAB) Workshop, pp. 1.1-1.8. BMVA Press, September 2015.
- Hales, I. J., Hansen, M. F., Broadbent, L., and Smith, M. L. (2014). A Modular Capture System for Automatic Weed Imaging and Analysis. Poster presented at the International Workshop on Image Analysis Methods for the Plant Sciences in Aberystwyth, Wales.
- Hales, I. J., Hogg, D. C., Boyle, R. D. and Ng, K. C. (2013). Automated Ground-Plane Estimation for Trajectory Rectification. In Computer Analysis of Images and Patterns, vol. 8048 of Lecture Notes in Computer Science, pp. 378-385.
- Hales, I. J., Boyle, R. D. and Ng, K. C. (2010). An Unsupervised Approach to Anonymous Crowd Monitoring. BMVA Technical Meeting on Aerial Image Analysis & Classification, London, UK.
Thesis
- Hales, I. J. (2014). Ground plane rectification from crowd motion. PhD thesis, University of Leeds.
Invited Talks
- Hales, I. J. (2015). Precision Agriculture: Working in the field (and above it) with computer vision. Invited talk at the Dronetech Bristol conference.
- Hales, I. J. (2015). Seeing the Wood for the Trees (Or the Crops for the Weeds) – 3D machine vision for agriculture and beyond. Joint Seminar for Computer Science and IBERS at Aberystwyth University.
PhD Students
- Dr. Gytis Bernotas, thesis entitled “3D Plant Phenotyping of Arabidopsis thaliana Using Photometric Stereo”. Passed, subject to minor amendments, 17 May 2019.
Bidding/Funding
Although my fixed-term contract made me ineligible to act as PI, I played a pivotal role in securing the following successful bids whilst at UWE:
- Innovate UK Agri-Tech Catalyst Round 5: GrassVision – Automated application of herbicides to broad-leaf weeds in grass crops (69482-447330, Granted 2015), collaboration with SoilEssentials Ltd. Total value: £114,548; £57,274 for UWE, matched in kind by SoilEssentials.
- BBSRC TRDF 1: 4D-HSI-4-Free: Integrating Sensors & Vision Process Engineering to Deliver a Tool for revealing 3D Hyperspectral Agri-data from 2D Images (BB/N02107X/1, Granted 2016), collaboration with the University of Edinburgh. Total value £149,646; £74,823 for UWE.
- BBSRC TRDF 2: An affordable stereoscopic camera array system for capturing real-time 3D responses to vegetation dense environments (BB/N02334X/1, Granted June 2016), collaboration with the University of Manchester. Total value £151,637; £75,818 for UWE.
Patent Applications
A crop monitoring system and method
GB2558029A: Filed Jul 4, 2018 [Approved, Link]
A harvester monitoring system configured to determine one or more parameters associated with harvested items (21/22, fig 3), the system comprising:
- a camera module (111, fig 9) having a field of view and configured to generate image data associated with the harvested items;
- a mounting bracket (113, fig 7) configured to secure the camera module to a harvester (100, fig 1) such that a conveyor (108, fig 1) of the harvester is within the field of view of the camera module;
- a location sub-system (14) configured to determine and output location data representative of a geographical location of the harvester monitoring system; and
- a processing unit configured to receive the image data and the location data, to determine one or more parameters associated with the harvested items, and to record the one or more parameters in association with the location data on a computer-readable medium;
wherein the camera module is configured to determine depth data for the harvested items. Also disclosed is a monitoring method.
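As a rough, hypothetical illustration of the kind of record the claim describes (image-derived parameters stored against location data), here is a minimal Python sketch. It is not the patented implementation; every class, field and function name here is my own assumption for illustration only.

```python
# Illustrative sketch only: pair depth-aware image measurements of harvested
# items with GPS data and append them to a log file (the "computer-readable
# medium"). All names and fields are assumptions, not taken from the patent.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone
import json


@dataclass
class HarvestRecord:
    timestamp: str
    latitude: float            # from the location sub-system
    longitude: float
    item_count: int            # example parameters derived from image/depth data
    mean_item_size_mm: float


def record_parameters(image_params: dict, location: tuple,
                      log_path: str = "harvest_log.jsonl") -> HarvestRecord:
    """Combine image-derived parameters with location data and persist them."""
    record = HarvestRecord(
        timestamp=datetime.now(timezone.utc).isoformat(),
        latitude=location[0],
        longitude=location[1],
        item_count=image_params["count"],
        mean_item_size_mm=image_params["mean_size_mm"],
    )
    with open(log_path, "a") as f:
        f.write(json.dumps(asdict(record)) + "\n")
    return record
```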
Break Analysis Apparatus and Method
GB2551894A: Filed May 1, 2017 [Pending: Link]
WO/2017/194950: Filed May 1, 2017 [Pending: Link]
A method, system and device are disclosed which enable the analysis of a break/crack in a vehicle glazing panel without the attendance of a technician. The driver of the vehicle photographs the break/crack using their mobile phone, and the image is transmitted to an image processing module and then to a break analysis module (s400). This module first isolates the crack from the background (s408) by applying a Fourier transform to the image (s402), followed by a Butterworth bandpass filter (s404) and an inverse Fourier transform (s406). A morphology routine (s410) is then applied to refine the image. Finally, the break analysis module determines the size of the crack, decides on that basis whether the crack needs attention (s412), and transmits the result to the driver.
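The sketch below gives a rough flavour of the frequency-domain pipeline the abstract describes (forward FFT, Butterworth band-pass, inverse FFT, then morphology), using NumPy and SciPy. It is not the patented implementation; the function names, threshold and filter parameters are all assumptions made for illustration.

```python
# Illustrative sketch (not the patented implementation): isolate crack-like
# structure from a glazing-panel photograph via a frequency-domain Butterworth
# band-pass filter followed by morphological refinement.
import numpy as np
from scipy import ndimage


def butterworth_bandpass(shape, low_cut, high_cut, order=2):
    """Build a Butterworth band-pass transfer function on the FFT grid."""
    rows, cols = shape
    u = np.fft.fftfreq(rows)[:, None]
    v = np.fft.fftfreq(cols)[None, :]
    d = np.sqrt(u ** 2 + v ** 2)                     # radial spatial frequency
    lowpass = 1.0 / (1.0 + (d / high_cut) ** (2 * order))
    highpass = 1.0 - 1.0 / (1.0 + (d / max(low_cut, 1e-6)) ** (2 * order))
    return lowpass * highpass


def isolate_crack(image, low_cut=0.02, high_cut=0.25, min_area=50):
    """Return a crack mask, its pixel area, and whether it needs attention."""
    spectrum = np.fft.fft2(image.astype(float))                       # forward FFT
    transfer = butterworth_bandpass(image.shape, low_cut, high_cut)   # band-pass
    response = np.abs(np.fft.ifft2(spectrum * transfer))              # inverse FFT
    mask = response > response.mean() + 2 * response.std()            # crude threshold
    mask = ndimage.binary_closing(mask, iterations=2)                 # morphology
    mask = ndimage.binary_opening(mask)
    labels, n = ndimage.label(mask)
    sizes = ndimage.sum(mask, labels, range(1, n + 1)) if n else np.array([])
    crack_area = int(sizes.max()) if sizes.size else 0
    return mask, crack_area, crack_area >= min_area                   # size-based decision
```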