Title:
SEX BASED ARTHROPOD SORTING SYSTEM AND METHOD
Document Type and Number:
WIPO Patent Application WO/2023/197042
Kind Code:
A1
Abstract:
A sex based arthropod sorting system and method is described which preferentially collects individuals of a selected sex in a final collection module. The system comprises a behavioural sorting module, a computer vision based tracking and targeting module and a final collection chamber with an associated air flow system that controls the direction of air flow through the system to provide one-way movement. The system may also include a rearing module, a sieving module and an emergence collection container which is connected to the behavioural sorting module. The behavioural sorting module uses behavioural differences between the sexes to preferentially collect the selected individuals. An integrated air flow system controls the direction of air flow through the system to provide one-way movement, including through a chamber of a computer vision based tracking and targeting module. The sex of the individuals is recognised, and a directed energy beam, such as a laser, is used to kill or sterilise individuals of the non-selected sex (the target sex), whilst allowing individuals of the selected sex to pass through the chamber and into the final collection chamber.

Inventors:
TREWIN BRENDAN (AU)
WANG XIAOBEI (AU)
GENSEMER STEPHEN (AU)
GOLEBIEWSKI MACIEJ (AU)
Application Number:
PCT/AU2023/050307
Publication Date:
October 19, 2023
Filing Date:
April 14, 2023
Assignee:
COMMONWEALTH SCIENTIFIC AND INDUSTRIAL RESEARCH ORGANISATION (AU)
International Classes:
G06V40/10; A01K67/033; A01M1/06; A01M1/08; A01M1/22; G06T7/20; G06T7/70; G06V10/82; G06V20/70
Foreign References:
US20200281164A12020-09-10
US20180206473A12018-07-26
CN102726358A2012-10-17
US20220053743A12022-02-24
US20200154685A12020-05-21
Other References:
MULLEN EMMA R., RUTSCHMAN PHILLIP, PEGRAM NATHAN, PATT JOSEPH M., ADAMCZYK JOHN J., JOHANSON: "Laser system for identification, tracking, and control of flying insects", OPTICS EXPRESS, vol. 24, no. 11, 30 May 2016 (2016-05-30), pages 11828, XP093102552, DOI: 10.1364/OE.24.011828
RAKHMATULIN ILDAR: "Raspberry PI for Kill Mosquitoes by Laser", MEDIUM, 9 March 2021 (2021-03-09), XP093102553, Retrieved from the Internet [retrieved on 20231116]
PODA SERGE B., NIGNAN CHARLES, GNANKINÉ OLIVIER, DABIRÉ ROCH K., DIABATÉ ABDOULAYE, ROUX OLIVIER: "Sex aggregation and species segregation cues in swarming mosquitoes: role of ground visual markers", PARASITES & VECTORS, vol. 12, no. 1, 1 December 2019 (2019-12-01), XP093102554, DOI: 10.1186/s13071-019-3845-5
KOHLHOFF KAI J., JAHN THOMAS R., LOMAS DAVID A., DOBSON CHRISTOPHER M., CROWTHER DAMIAN C., VENDRUSCOLO MICHELE: "The iFly tracking system for an automated locomotor and behavioural analysis of Drosophila melanogaster", INTEGRATIVE BIOLOGY, RSC PUBL., CAMBRIDGE, vol. 3, no. 7, 1 January 2011 (2011-01-01), Cambridge , pages 755, XP093102555, ISSN: 1757-9694, DOI: 10.1039/c0ib00149j
CHEN CHING-HSIN, CHIANG ANN-SHYN, TSAI HUNG-YIN: "Three-Dimensional Tracking of Multiple Small Insects by a Single Camera", JOURNAL OF INSECT SCIENCE, vol. 21, no. 6, 1 November 2021 (2021-11-01), XP093102558, DOI: 10.1093/jisesa/ieab079
SAVALL JOAN, HO ERIC TATT WEI, HUANG CHENG, MAXEY JESSICA R, SCHNITZER MARK J: "Dexterous robotic manipulation of alert adult Drosophila for high-content experimentation", NATURE METHODS, NATURE PUBLISHING GROUP US, NEW YORK, vol. 12, no. 7, 1 July 2015 (2015-07-01), New York, pages 657 - 660, XP093102560, ISSN: 1548-7091, DOI: 10.1038/nmeth.3410
Attorney, Agent or Firm:
MADDERNS PTY LTD (AU)
Claims:
CLAIMS

1. A computer vision based three-dimensional tracking and targeting method for sex based sorting of arthropods, the method comprising: capturing a stream of video images either from each of at least two image sensors wherein the at least two image sensors are spaced apart and each have a respective field of view that share an overlapping region including at least a portion of a targeting chamber, or from a single image sensor and a mirror arrangement configured such that a field of view of the single image sensor is split into two portions where each portion has a respective field of view that share an overlapping region including at least a portion of a targeting chamber, wherein a plurality of arthropods move through the targeting chamber from an entry aperture to an exit aperture and an air flow system is configured to direct air flow through the targeting chamber; tracking a three dimensional position of one or more arthropods using the stream of video images; determining a sex of an arthropod by identifying the arthropod in at least one image in the stream of video images and using a trained machine learning classifier to estimate the sex of the arthropod; predicting a target position at a future time of an arthropod classified as a target sex using the tracked three dimensional positions of the arthropod; and firing a directed energy beam at the target position at the future time.

2. The method as claimed in claim 1 further comprising generating a tracking stream of images and generating a classification stream of images from the stream of video images wherein each of the tracking streams are synchronised and each image in the tracking stream is down converted to a predefined lower resolution image size for tracking, and the step of tracking a three dimensional position is performed using the lower resolution tracking stream of images and the step of determining a sex of an arthropod is performed using the classification stream of images by identifying one or more arthropods in an image in the classification stream of images and using a trained machine learning classifier to estimate the sex of each of the identified one or more arthropods, and the steps of tracking a three dimensional position and determining a sex of an arthropod are performed in parallel.

3. The method as claimed in claim 1 wherein the at least two image sensors comprises at least four image sensors divided into two sets of at least two image sensors, wherein the first set is used to generate a tracking stream of images and the second set is used to generate a classification stream of images, and the stream of images from each of the image sensors in the first set are synchronised, and images in the tracking stream are of a lower resolution than the images in the classification stream, and the step of tracking a three dimensional position is performed using the lower resolution tracking stream of images and the step of determining a sex of an arthropod is performed using the classification stream of images by identifying one or more arthropods in an image in the classification stream of images and using a trained machine learning classifier to estimate the sex of each of the identified one or more arthropods, and the steps of tracking a three dimensional position and determining a sex of an arthropod are performed in parallel.

4. The method as claimed in claim 2 or 3 wherein tracking the three dimensional position of one or more arthropods comprises: generating a first subtracted image captured at a first time by comparing a first image at the first time with at least the previous frame from a first tracking stream of images; identifying one or more arthropods in the first subtracted image and estimating a first position for each identified arthropod; generating a second subtracted image captured at the first time by comparing a second image at the first time with at least the previous frame from the second tracking stream of images; identifying one or more arthropods in the second subtracted image and estimating a second position for each identified arthropod; determining a three dimensional position for each identified arthropod by using a ray-tracing method that uses the position of the first sensor and the first position to generate a first ray, and the position of the second sensor and the second position to generate a second ray, and the three dimensional position is determined based on identifying an intersection point of the first ray and the second ray within the targeting chamber; determining if the identified arthropod exists in a tracking database, and if so obtaining an identifier for the identified arthropod otherwise registering a new identifier for the identified arthropod; and storing the three dimensional position of the arthropod, and the associated order, time and the arthropod identifier in the tracking database.

5. The method as claimed in claim 4 wherein determining a sex of an arthropod comprises: identifying one or more arthropods in each image in the classification stream of images and generating a candidate arthropod image by cropping the image to a predetermined classification image size around an identified arthropod; and providing each respective candidate arthropod image to the trained machine learning classifier and obtaining an estimate of the sex of the arthropod and storing the estimate of the sex of the arthropod with the identifier of the arthropod from the tracking database.

6. The method as claimed in any one of claims 1 to 5 wherein the directed energy beam is a laser and mirror galvanometer system wherein the orientation of the mirror galvanometer is controlled to direct the laser beam at the predicted target position of the identifier of the arthropod wherein the estimated sex is matched with a target sex from the tracking database, and the laser is fired at the target position at the future time.

7. The method as claimed in any one of claims 1 to 6, wherein the targeting chamber is illuminated by a lighting system which is provided on each side of the targeting chamber distal from an image sensor to backlight the targeting chamber with respect to the image sensor.

8. The method as claimed in any one of claims 1 to 7, further comprising behaviourally sorting the arthropods using one or more sequential sorting chambers, each having an exit aperture and providing the behaviourally sorted arthropods to the entry aperture of the targeting chamber, wherein the one or more sorting chambers comprise: one or more lures wherein the air flow system is controlled to direct air into or out of the exit chamber of each sorting chamber based on a type of lure to lure arthropods of the non-target sex through the one or more sequential sorting chambers.

9. The method as claimed in claim 8, wherein at least one of the one or more lures is an audio lure or an optical lure located adjacent the exit aperture of a behavioural sorting chamber and is repeatedly switched on and off, wherein the respective lure is switched on for a lure time period and when the respective lure is switched on the air flow system is configured to provide negative air flow to suck air through the exit aperture from the behavioural sorting chamber so as to suck lured arthropods out of the sorting chamber, and the air is then either directed into a next behavioural sorting chamber or into the entry aperture of the targeting chamber if the behavioural sorting chamber is a last behavioural sorting chamber in a sequence of the one or more behavioural sorting chambers.

10. The method as claimed in claim 9, wherein when the respective lure is switched off the air flow system is configured to provide zero or positive air flow into the exit aperture to prevent lured arthropods exiting the behavioural sorting chamber.

11. The method as claimed in claim 10, wherein each of the at least one lure located adjacent the exit aperture is an audio lure which broadcasts an audio signal at a frequency or frequency range determined from a wingbeat frequency range of the target sex.

12. The method as claimed in claim 8, wherein the one or more lures comprises at least one chemical attractant lure for the non-target sex located in a collection chamber after the targeting chamber, and the air flow system is configured to direct air through the targeting chamber and into each of the one or more behavioural sorting chambers via the associated exit aperture.

13. The method as claimed in any one of claims 8 to 12, wherein each behavioural sorting chamber comprises at least one lure within the chamber and each chamber further comprises: a swarm marker structure comprising a first colour portion comprising the exit aperture and a second colour portion surrounding the first colour portion wherein the second colour is a contrasting colour to the first colour, and the at least one lure within the chamber is mounted to the swarm marker structure.

14. The method as claimed in claim 13, wherein the swarm marker is a tubular structure with a black or dark colour that projects into the behavioural sorting chamber from a base wall of the sorting chamber which is connected to a containment structure with a white or light colour, wherein the exit aperture is located at a distal end of the tubular structure with respect to the base wall of the behavioural sorting chamber.

15. The method as claimed in any one of claims 8 to 14, further comprising sieving pupae through a sieving apparatus and into a collection container, wherein the sieve apparatus comprises a plurality of apertures with a shape and size configured to preferentially allow pupae with a first sex to pass through the sieving apparatus and to retain pupae of a target sex, wherein the collection container is further connected to a first behavioural sorting chamber of the at least one sorting chamber to receive a plurality of adult arthropods emerging from the sieved pupae.

16. A sex based arthropod sorting system comprising: a targeting chamber comprising an entry aperture and an exit aperture; an air flow system configured to direct air flow through the chamber from the entry aperture to the exit aperture; at least two image sensors wherein the at least two image sensors are spaced apart and each have a respective field of view that share an overlapping region including at least a portion of the targeting chamber, or a single image sensor and a mirror arrangement configured such that a field of view of the single image sensor is split into two portions where each portion has a respective field of view that share an overlapping region including at least a portion of the targeting chamber; a directed energy system comprising a directed energy source and a targeting apparatus for directing a directed energy beam at a target position within the targeting chamber; and at least one processor and at least one memory, wherein the at least one memory comprises instructions for configuring the at least one processor to perform the computer vision tracking and targeting method for sex based sorting of arthropods according to any one of claims 1 to 5.

17. The sex based arthropod sorting system as claimed in claim 16 wherein the directed energy source is a laser, the directed energy beam is a laser beam and the targeting apparatus is a mirror galvanometer system wherein the orientation of the mirror galvanometer is controlled by the at least one processor to direct the laser beam at the predicted target position, and the laser is fired at the future time.

18. The sex based arthropod sorting system as claimed in claim 16 or 17 further comprising a lighting system comprising a pair of translucent walls provided on each side of the targeting chamber distal from an image sensor to backlight the targeting chamber with respect to the image sensor and which are illuminated by a lighting panel behind each translucent wall.

19. The sex based arthropod sorting system as claimed in any one of claims 16 to 18, further comprising one or more sequential behavioural sorting chambers, each having an exit aperture and the exit aperture of a last sorting behavioural chamber is connected to the entry aperture of the targeting chamber, wherein the one or more behavioural sorting chambers comprise: one or more lures wherein the air flow system is controlled to direct air into or out of the exit aperture of each behavioural sorting chamber based on a type of lure to lure arthropods of the non-target sex through the one or more sequential behavioural sorting chambers.

20. The sex based arthropod sorting system as claimed in claim 19, wherein at least one of the one or more lures is an audio lure or an optical lure located adjacent the exit aperture of a behavioural sorting chamber and is repeatedly switched on and off, wherein the respective lure is switched on for a lure time period and when the respective lure is switched on the air flow system is configured to provide negative air flow to suck air through the exit aperture from the behavioural sorting chamber so as to suck lured arthropods out of the sorting chamber, and the air is then either directed into a next behavioural sorting chamber or into the entry aperture of the targeting chamber if the behavioural sorting chamber is a last behavioural sorting chamber in a sequence of the one or more sorting chambers.

21. The sex based arthropod sorting system as claimed in claim 20, wherein when the respective lure is switched off, the air flow system is configured to provide zero or positive air flow into the exit aperture to prevent lured arthropods exiting the behavioural sorting chamber.

22. The sex based arthropod sorting system as claimed in claim 21, wherein each of the at least one lure located adjacent the exit aperture is an audio lure which broadcasts an audio signal at a frequency or frequency range determined from a wingbeat frequency range of the target sex.

23. The sex based arthropod sorting system as claimed in claim 19, wherein the one or more lures comprises at least one chemical attractant lure for the non-target sex located in a collection chamber after the targeting chamber, and the air flow system is configured to direct air through the targeting chamber and into each of the one or more behavioural sorting chambers via the associated exit aperture.

24. The sex based arthropod sorting system as claimed in any one of claims 19 to 23, wherein each behavioural sorting chamber comprises at least one lure within the chamber and each behavioural sorting chamber further comprises: a swarm marker structure comprising a first colour portion comprising the exit aperture and a second colour portion surrounding the first colour portion wherein the second colour is a contrasting colour to the first colour, and the at least one lure within the behavioural sorting chamber is mounted to the swarm marker structure.

25. The sex based arthropod sorting system as claimed in claim 24, wherein the swarm marker is a tubular structure with a black or dark colour that projects into the behavioural sorting chamber from a base wall of the behavioural sorting chamber which is connected to a base structure with a white or light colour, wherein the exit aperture is located at a distal end of the tubular structure with respect to the base wall of the behavioural sorting chamber.

26. The sex based arthropod sorting system as claimed in any one of claims 16 to 25, further comprising a sieving apparatus which is configured to sieve pupae into a collection container, wherein the sieve apparatus comprises a plurality of apertures with a shape and size configured to preferentially allow pupae with a first sex to pass through the sieving apparatus and to retain pupae of a target sex, wherein the collection container is further connected to a first sorting chamber of the at least one sorting chamber to receive a plurality of adult arthropods emerging from the sieved pupae.

27. The sex based arthropod sorting system as claimed in any one of claims 16 to 26, further comprising a collection chamber connected to the exit aperture of the targeting chamber to collect arthropods of the non-target sex after traversing the targeting chamber.

Description:
SEX BASED ARTHROPOD SORTING SYSTEM AND METHOD

PRIORITY DOCUMENTS

[0001] The present application claims priority from Australian Provisional Patent Application No. 2022901010 titled “SEX BASED ARTHROPOD SORTING SYSTEM AND METHOD” and filed on 14 April 2022, the content of which is hereby incorporated by reference in its entirety.

TECHNICAL FIELD

[0002] The present disclosure relates to sex sorting of arthropods. In a particular form the present disclosure relates to a computer vision based tracking and targeting system for sex based sorting of arthropods.

BACKGROUND

[0003] Vector borne diseases are a leading cause of mortality and morbidity throughout tropical regions. For example, the Aedes aegypti and Aedes albopictus mosquito species are both vectors for the dengue, chikungunya, Zika and yellow fever viruses. Many Anopheles species of mosquito are vectors for malaria. Other arthropod species such as the Mediterranean fruit fly (Ceratitis capitata), moths, and beetles represent threats to agriculture. There is thus great interest in controlling populations of many arthropod species.

[0004] One approach that has been successfully trialled in mosquitoes is the incompatible (or sterile) insect technique (IIT or SIT). This involves rearing a population of sterile male mosquitoes which are released to mate with wild females. These females then lay infertile eggs and over several months the mosquito population is suppressed, and then eliminated, thus removing all mosquito-borne disease transmission. Both the IIT and SIT methods have been used successfully over many years in different insect systems. The males may be rendered infertile through various means. In the IIT system males are infected with the Wolbachia bacteria. Wolbachia bacteria naturally occur in many arthropod species (~60%) and it has been discovered that some newly established infections prevent (or at least significantly suppress) replication of virus in infected mosquitoes. Further, when males infected with a Wolbachia bacteria strain mate with wild females that don't contain the same Wolbachia strain or contain a different Wolbachia strain to the males, the resultant eggs are infertile in a process called cytoplasmic incompatibility. In other approaches males may be rendered infertile through genetic engineering, chemical or radiation treatments.

[0005] A problem with such IIT (or SIT) systems is that they require robust mass rearing of mosquitoes and highly accurate sorting of females from males. For example, one trial in Innisfail, Queensland, involved rearing and releasing three million infertile (Wolbachia infected) male mosquitoes. This process requires high accuracy sorting of males from females to prevent (or minimise) contamination by female mosquitoes infected with the same strain of Wolbachia. These Wolbachia infected females may still mate with wild male mosquitoes, generating viable offspring. For example, it is desirable that sorting (or contamination) rates of the order of 1 in 100,000, and preferably 1 in 1 million or 1 in 10 million, are achieved. It is also desirable that the sorting system does not harm the male mosquitoes passing through the system, to ensure they are as fit as possible when released and can compete with wild male mosquitoes and successfully mate with wild female mosquitoes.

[0006] A problem with existing sorting systems is that they are often manual, have low sorting accuracy (i.e., high contamination), are slow to sort sexes, impact male fitness and/or have high wastage rates (e.g., many dead pupae or adult male mosquitoes).

[0007] There is thus a need to provide improved methods and systems for sorting arthropods based on their sex, or to at least provide a useful alternative to existing systems.

SUMMARY

[0008] According to a first aspect, there is provided a computer vision based three-dimensional tracking and targeting method for sex based sorting of arthropods, the method comprising: capturing a stream of video images either from each of at least two image sensors wherein the at least two image sensors are spaced apart and each have a respective field of view that share an overlapping region including at least a portion of a targeting chamber, or from a single image sensor and a mirror arrangement configured such that a field of view of the single image sensor is split into two portions where each portion has a respective field of view that share an overlapping region including at least a portion of a targeting chamber, wherein a plurality of arthropods move through the targeting chamber from an entry aperture to an exit aperture and an air flow system is configured to direct air flow through the targeting chamber; tracking a three dimensional position of one or more arthropods using the stream of video images; determining a sex of an arthropod by identifying the arthropod in at least one image in the stream of video images and using a trained machine learning classifier to estimate the sex of the arthropod; predicting a target position at a future time of an arthropod classified as a target sex using the tracked three dimensional positions of the arthropod; and firing a directed energy beam at the target position at the future time.

[0009] In one form, the method further comprises generating a tracking stream of images and generating a classification stream of images from the stream of video images wherein each of the tracking streams are synchronised and each image in the tracking stream is down converted to a predefined lower resolution image size for tracking, and the step of tracking a three dimensional position is performed using the lower resolution tracking stream of images and the step of determining a sex of an arthropod is performed using the classification stream of images by identifying one or more arthropods in an image in the classification stream of images and using a trained machine learning classifier to estimate the sex of each of the identified one or more arthropods, and the steps of tracking a three dimensional position and determining a sex of an arthropod are performed in parallel.

[0010] In one form, the at least two image sensors comprises at least four image sensors divided into two sets of at least two image sensors, wherein the first set is used to generate a tracking stream of images and the second set is used to generate a classification stream of images, and the stream of images from each of the image sensors in the first set are synchronised, and images in the tracking stream are of a lower resolution than the images in the classification stream, and the step of tracking a three dimensional position is performed using the lower resolution tracking stream of images and the step of determining a sex of an arthropod is performed using the classification stream of images by identifying one or more arthropods in an image in the classification stream of images and using a trained machine learning classifier to estimate the sex of each of the identified one or more arthropods, and the steps of tracking a three dimensional position and determining a sex of an arthropod are performed in parallel.

[0011] In one form, tracking the three dimensional position of one or more arthropods comprises: generating a first subtracted image captured at a first time by comparing a first image at the first time with at least the previous frame from a first tracking stream of images; identifying one or more arthropods in the first subtracted image and estimating a first position for each identified arthropod; generating a second subtracted image captured at the first time by comparing a second image at the first time with at least the previous frame from the second tracking stream of images; identifying one or more arthropods in the second subtracted image and estimating a second position for each identified arthropod; determining a three dimensional position for each identified arthropod by using a ray-tracing method that uses the position of the first sensor and the first position to generate a first ray, and the position of the second sensor and the second position to generate a second ray, and the three dimensional position is determined based on identifying an intersection point of the first ray and the second ray within the targeting chamber; determining if the identified arthropod exists in a tracking database, and if so obtaining an identifier for the identified arthropod otherwise registering a new identifier for the identified arthropod; and storing the three dimensional position of the arthropod, and the associated order, time and the arthropod identifier in the tracking database.
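
To make this form concrete, the following is a minimal Python sketch of frame-difference detection and closest-point ray triangulation. It assumes grayscale frames, known camera origins and per-detection ray directions recovered from a prior camera calibration; the function names and thresholds are illustrative, not taken from the disclosure.

```python
import cv2
import numpy as np

def detect_centroids(frame, prev_frame, thresh=25, min_area=4.0):
    """Frame-difference detection: subtract the previous grayscale frame,
    threshold the result and return one (x, y) centroid per remaining blob
    (each blob is a candidate arthropod)."""
    diff = cv2.absdiff(frame, prev_frame)
    _, mask = cv2.threshold(diff, thresh, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    centroids = []
    for c in contours:
        m = cv2.moments(c)
        if m["m00"] >= min_area:           # ignore single-pixel noise
            centroids.append((m["m10"] / m["m00"], m["m01"] / m["m00"]))
    return centroids

def triangulate(origin_a, dir_a, origin_b, dir_b, eps=1e-9):
    """Midpoint of the shortest segment between two back-projected rays.

    Noisy detections mean the two rays rarely intersect exactly, so the
    midpoint of closest approach serves as the 3D position estimate; the
    caller should check that it lies within the targeting chamber."""
    oa, ob = np.asarray(origin_a, float), np.asarray(origin_b, float)
    da = np.asarray(dir_a, float); da /= np.linalg.norm(da)
    db = np.asarray(dir_b, float); db /= np.linalg.norm(db)
    w0 = oa - ob
    a, b, c = da @ da, da @ db, db @ db
    d, e = da @ w0, db @ w0
    denom = a * c - b * b
    if abs(denom) < eps:                   # near-parallel rays: no reliable fix
        return None
    s = (b * e - c * d) / denom            # parameter along ray A
    t = (a * e - b * d) / denom            # parameter along ray B
    return (oa + s * da + ob + t * db) / 2.0
```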

[0012] In a further form, determining a sex of an arthropod comprises: identifying one or more arthropods in each image in the classification stream of images and generating a candidate arthropod image by cropping the image to a predetermined classification image size around an identified arthropod; and providing each respective candidate arthropod image to the trained machine learning classifier and obtaining an estimate of the sex of the arthropod and storing the estimate of the sex of the arthropod with the identifier of the arthropod from the tracking database.
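
A minimal sketch of this crop-and-classify step is given below. The crop size, the `classifier` callable (assumed here to return a probability that the arthropod is female) and the tracking database layout are assumptions for illustration only.

```python
CLASS_CROP = 128   # assumed "predetermined classification image size" (pixels)

def crop_candidate(frame, cx, cy, size=CLASS_CROP):
    """Crop a fixed-size square around a detected arthropod, clamping the
    window so it stays inside the frame."""
    h, w = frame.shape[:2]
    x0 = int(min(max(cx - size / 2, 0), w - size))
    y0 = int(min(max(cy - size / 2, 0), h - size))
    return frame[y0:y0 + size, x0:x0 + size]

def classify_detections(classifier, frame, detections, tracking_db):
    """Run the trained classifier on each candidate crop and store the sex
    estimate against the arthropod's identifier in the tracking database."""
    for arthropod_id, (cx, cy) in detections:
        p_female = float(classifier(crop_candidate(frame, cx, cy)))
        tracking_db[arthropod_id]["sex"] = "F" if p_female >= 0.5 else "M"
```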

[0013] In one form, the directed energy beam is a laser and mirror galvanometer system wherein the orientation of the mirror galvanometer is controlled to direct the laser beam at the predicted target position of the identifier of the arthropod wherein the estimated sex is matched with a target sex from the tracking database, and the laser is fired at the target position at the future time.
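
Under a constant-velocity assumption, the predicted target position can be obtained by extrapolating the most recent tracked positions over the total system latency. The sketch below is illustrative only; the galvanometer and laser driver calls are hypothetical, not APIs from the disclosure.

```python
import numpy as np

def predict_target(track, lead_time):
    """Constant-velocity extrapolation of a tracked 3D position.

    track: time-ordered list of (timestamp_s, xyz) samples from the tracking
    database. lead_time: total latency in seconds (capture, inference and
    galvanometer settling) between the last observation and laser firing."""
    (t0, p0), (t1, p1) = track[-2], track[-1]
    p0, p1 = np.asarray(p0, float), np.asarray(p1, float)
    velocity = (p1 - p0) / (t1 - t0)
    return p1 + velocity * lead_time

# Illustrative firing logic (driver functions are hypothetical):
# if tracking_db[arthropod_id]["sex"] == TARGET_SEX:
#     aim = predict_target(tracking_db[arthropod_id]["track"], LEAD_TIME)
#     set_mirror_angles(xyz_to_angles(aim))   # steer the galvanometer mirrors
#     fire_laser()
```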

[0014] In one form, the targeting chamber is illuminated by a lighting system which is provided on each side of the targeting chamber distal from an image sensor to backlight the targeting chamber with respect to the image sensor.

[0015] In one form, the method further comprises behaviourally sorting the arthropods using one or more sequential behavioural sorting chambers, each having an exit aperture and providing the behaviourally sorted arthropods to the entry aperture of the targeting chamber, wherein the one or more sorting chambers comprise: one or more lures wherein the air flow system is controlled to direct air into or out of the exit chamber of each behavioural sorting chamber based on a type of lure to lure arthropods of the non-target sex through the one or more sequential behavioural sorting chambers.

[0016] In a further form, at least one of the one or more lures is an audio lure or an optical lure located adjacent to the exit aperture of a behavioural sorting chamber and is repeatedly switched on and off, wherein the respective lure is switched on for a lure time period and when the respective lure is switched on the air flow system is configured to provide negative air flow to suck air through the exit aperture so as to suck lured arthropods out of the sorting chamber, and the air is then either directed into a next behavioural sorting chamber or into the entry aperture of the targeting chamber if the behavioural sorting chamber is a last behavioural sorting chamber in a sequence of the one or more behavioural sorting chambers.

[0017] In a further form, when the respective lure is switched off, the air flow system is configured to provide zero or positive air flow into the exit aperture to prevent lured arthropods exiting the behavioural sorting chamber.

[0018] In a further form, each of the at least one lure located adjacent the exit aperture is an audio lure which broadcasts an audio signal at a frequency or frequency range determined from a wingbeat frequency range of the target sex.

[0019] In a further form, the one or more lures comprises at least one chemical attractant lure for the non-target sex located in a collection chamber after the targeting chamber, and the air flow system is configured to direct air through the targeting chamber and into each of the one or more behavioural sorting chambers via the associated exit aperture.

[0020] In one form, each behavioural sorting chamber comprises at least one lure within the chamber and each chamber further comprises: a swarm marker structure comprising a first colour portion comprising the exit aperture and a second colour portion surrounding the first colour portion wherein the second colour is a contrasting colour to the first colour, and the at least one lure within the chamber is mounted to the swarm marker structure.

[0021] In a further form, the swarm marker is a tubular structure with a black or dark colour that projects into the behavioural sorting chamber from a base wall of the sorting chamber which is connected to a containment structure with a white or light colour, wherein the exit aperture is located at a distal end of the tubular structure with respect to the base wall of the behavioural sorting chamber.

[0022] In one form, the method further comprises sieving pupae through a sieving apparatus and into a collection container, wherein the sieve apparatus comprises a plurality of apertures with a shape and size configured to preferentially allow pupae with a first sex to pass through the sieving apparatus and to retain pupae of a target sex, wherein the collection container is further connected to a first behavioural sorting chamber of the at least one sorting chamber to receive a plurality of adult arthropods emerging from the sieved pupae.

[0023] According to a second aspect, there is provided a sex based arthropod sorting system comprising: a targeting chamber comprising an entry aperture and an exit aperture; an air flow system configured to direct air flow through the chamber from the entry aperture to the exit aperture; at least two image sensors wherein the at least two image sensors are spaced apart and each have a respective field of view that share an overlapping region including at least a portion of the targeting chamber, or a single image sensor and a mirror arrangement configured such that a field of view of the single image sensor is split into two portions where each portion has a respective field of view that share an overlapping region including at least a portion of the targeting chamber; a directed energy system comprising a directed energy source and a targeting apparatus for directing a directed energy beam at a target position within the targeting chamber; and at least one processor and at least one memory, wherein the at least one memory comprises instructions for configuring the at least one processor to perform the computer vision tracking and targeting method for sex based sorting of arthropods according to the first aspect.

[0024] In one form, the directed energy source is a laser, the directed energy beam is a laser beam and the targeting apparatus is a mirror galvanometer system wherein the orientation of the mirror galvanometer is controlled by the at least one processor to direct the laser beam at the predicted target position, and the laser is fired at the future time.

[0025] In one form, the system further comprises a lighting system comprising a pair of translucent walls provided on each side of the targeting chamber distal from an image sensor to backlight the targeting chamber with respect to the image sensor and which are illuminated by a lighting panel behind each translucent wall.

[0026] In one form, the system further comprises one or more sequential behavioural sorting chambers, each having an exit aperture and the exit aperture of a last behavioural sorting chamber is connected to the entry aperture of the targeting chamber, wherein the one or more behavioural sorting chambers comprise: one or more lures wherein the air flow system is controlled to direct air into or out of the exit aperture of each behavioural sorting chamber based on a type of lure to lure arthropods of the non-target sex through the one or more sequential behavioural sorting chambers.

[0027] In a further form, at least one of the one or more lures is an audio lure or an optical lure located adjacent the exit aperture of a behavioural sorting chamber and is repeatedly switched on and off, wherein the respective lure is switched on for a lure time period and when the respective lure is switched on the air flow system is configured to provide negative air flow to suck air through the exit aperture from the chamber so as to suck lured arthropods out of the sorting chamber, and the air is then either directed into a next behavioural sorting chamber or into the entry aperture of the targeting chamber if the behavioural sorting chamber is a last behavioural sorting chamber in a sequence of the one or more sorting chambers.

[0028] In a further form, when the respective lure is switched off, the air flow system is configured to provide zero or positive air flow into the exit aperture to prevent lured arthropods exiting the behavioural sorting chamber.

[0029] In a further form, each of the at least one lure located adjacent the exit aperture is an audio lure which broadcasts an audio signal at a frequency or frequency range determined from a wingbeat frequency range of the target sex.

[0030] In a further form, the one or more lures comprises at least one chemical attractant lure for the non-target sex located in a collection chamber after the targeting chamber, and the air flow system is configured to direct air through the targeting chamber and into each of the one or more behavioural sorting chambers via the associated exit aperture.

[0031] In one form, each behavioural sorting chamber comprises at least one lure within the chamber and each behavioural sorting chamber further comprises: a swarm marker structure comprising a first colour portion comprising the exit aperture and a second colour portion surrounding the first colour portion wherein the second colour is a contrasting colour to the first colour, and the at least one lure within the behavioural sorting chamber is mounted to the swarm marker structure.

[0032] In one form, the swarm marker is a tubular structure with a black or dark colour that projects into the behavioural sorting chamber from a base wall of the behavioural sorting chamber which is connected to a base structure with a white or light colour, wherein the exit aperture is located at a distal end of the tubular structure with respect to the base wall of the behavioural sorting chamber.

[0033] In one form, the system further comprises a sieving apparatus which is configured to sieve pupae into a collection container, wherein the sieve apparatus comprises a plurality of apertures with a shape and size configured to preferentially allow pupae with a first sex to pass through the sieving apparatus and to retain pupae of a target sex, wherein the collection container is further connected to a first sorting chamber of the at least one sorting chamber to receive a plurality of adult arthropods emerging from the sieved pupae.

[0034] In one form, the system further comprises a collection chamber connected to the exit aperture of the targeting chamber to collect arthropods of the non-target sex after traversing the targeting chamber.

BRIEF DESCRIPTION OF DRAWINGS

[0035] Embodiments of the present disclosure will be discussed with reference to the accompanying drawings wherein:

[0036] Figure 1A is a schematic diagram of a modular system for sex based sorting of arthropods according to an embodiment;

[0037] Figure 1B is a flow chart of a method for sex sorting of arthropods using a computer vision system for tracking and targeting of arthropods with a target sex according to an embodiment;

[0038] Figure 2 is a schematic diagram of a computer vision tracking and targeting method for sex based sorting of arthropods according to an embodiment;

[0039] Figure 3A is an exploded view of a targeting chamber according to an embodiment;

[0040] Figure 3B shows a pair of images captured by a pair of cameras with approximately orthogonal views of the targeting chamber according to an embodiment;

[0041] Figure 3C is a schematic view illustrating the relative geometry of each camera with respect to the target chamber and ray traces used for estimating positions of mosquitoes according to an embodiment;

[0042] Figure 4A is a schematic illustration of generation of a subtracted image according to an embodiment;

[0043] Figure 4B is a schematic illustration of an Xception architecture and associated input, middle layer and output images according to an embodiment;

[0044] Figure 5A is a visualization of prediction error rates relative to mosquito movement within the target chamber where darker grey colours represent an increase in error (inaccuracy) relative to the velocity of mosquito movement, measured as the difference in mm between two video frames at 30fps according to an embodiment;

[0045] Figure 5B is a plot showing a linear relationship between distance moved between video frames (x axis) and the predicted location of the mosquito to be targeted by the laser (y axis) according to an embodiment;

[0046] Figure 5C is a plot showing the Mean and Standard Deviation in target prediction inaccuracy compared to mosquito movement between video frames according to an embodiment;

[0047] Figure 5D is a schematic diagram of delays within the system when tracking and targeting a single mosquito with the laser according to an embodiment;

[0048] Figure 6A is a plot showing the proportion of total male (black) and female (white) mosquitoes emerging since time of hatching according to an embodiment;

[0049] Figure 6B is a representation of a sieve according to an embodiment;

[0050] Figure 7 is a representation of a swarm marker according to an embodiment;

[0051] Figure 8A is a schematic diagram of a modular integrated sex sorting apparatus according to one embodiment;

[0052] Figure 8B is a schematic diagram of the internal frame and chamber of the modular system shown in Figure 8A;

[0053] Figure 9A is a schematic diagram of a modular integrated sex sorting apparatus according to one embodiment;

[0054] Figure 9B is a schematic diagram of a modular integrated sex sorting apparatus according to one embodiment;

[0055] Figure 9C is a schematic diagram of a modular integrated sex sorting apparatus according to one embodiment;

[0056] Figure 10 is a schematic diagram of a chamber incorporating a curved rear wall and curved lighting panel according to an embodiment; and

[0057] Figure 11 is a schematic diagram of a single camera system incorporating a mirror according to one embodiment.

[0058] In the following description, like reference characters designate like or corresponding parts throughout the figures.

DESCRIPTION OF EMBODIMENTS

[0059] Referring now to Figure 1A, there is shown a sex based arthropod sorting system 1, and Figure 1B is a flow chart of a method for sex sorting of arthropods using a computer vision system for tracking and targeting of arthropods with a target sex according to an embodiment. The sex based arthropod sorting system 1 is a modular system comprising a behavioural sorting module 40, a computer vision based tracking and targeting module 70 including a computing apparatus 60 and directed energy module 80, and a final collection module 98. In this embodiment the system also includes a rearing module 10, a sieving module 20, and a collection (emergence) container 30 which is connected to the behavioural sorting module 40. However, in other embodiments arthropods could be collected into a collection container using other collection apparatus, such as insect traps, and connected or transferred to the behavioural sorting module 40. An integrated air flow system controls the direction of air flow through the system (and various components) to direct or assist one-way movement of arthropods through the system. The different modules could each be provided separately, or various combinations of modules may be provided as components for a sex based sorting system; for example, the rearing module 10, the sieving module 20, and collection (emergence) container 30 may be provided in an integrated arthropod supply apparatus, and/or the behavioural sorting module 40, computer vision based tracking and targeting module 70 and final collection module 98 provided in a sex sorting apparatus, or a complete system comprising all modules may be provided. For ease of explanation, we consider an embodiment in which the arthropods are mosquitoes, and we wish to collect male mosquitoes and exclude female mosquitoes. In this embodiment males are the selected sex or the collected sex (i.e., the sex to be collected or retained) and females are the target sex (the sex to be sorted/excluded and targeted by the directed energy source). It will also be understood that the system could be used with other arthropods including flies, moths and beetles.

[0060] A rearing module 10 is used to rear larvae to the pupal stage for subsequent sieving. This rearing module controls the temperature and humidity and provides food sources for larvae. The rearing module may be configured to implement a standard rearing protocol. In one embodiment the rearing module is maintained at a constant temperature of 26°C (±1°C) and a humidity of 60% (±2%). The rearing module comprises water bays supplemented with a food source to ensure that larvae feed to satiation. Around 7 days after placing the eggs in the rearing module, the larvae begin to pupate. Thus, after a predefined time period the larvae/pupae are collected and mechanically sorted using a sieve module 20.

[0061] In some arthropod species male pupae typically emerge earlier than female pupae (e.g., 1-2 days), and thus the predefined time period at which collection is begun, and/or the duration of collection, may be selected to bias the population of pupae towards male pupae. Several trials were conducted based on rearing and sieving larvae over a period of up to 17 days. The sieved larvae were counted and their sex determined; Figure 6A is a plot showing the proportion of total male (black) and female (white) mosquitoes emerging since time of hatching. Thus, in one embodiment the predefined time period is 9 days, and the collection period is 3 days (that is, collection is performed on days 9 to 12) after which the rearing module is emptied and reset to begin a new rearing cycle.

[0062] The sieving module performs mechanical sorting of the reared larvae/pupae. In one embodiment manual sorting is performed. In another embodiment an automated sieving system may be used in which rearing containers are automatically poured into a sieve which is mechanically agitated using servomotors or similar electronically controlled motor arrangements (e.g., belts, cams, gears, etc.). In one embodiment the reared larvae/pupae are forced through a lift/drop and lengthwise shake of the sieve apparatus through the water column of a collection container 30. In some arthropod species there are morphological differences between male and female pupae. Thus, in one embodiment the sieve apparatus 20 comprises a plurality of apertures 22 with a shape and size configured to exploit morphological differences between male and female pupae. This preferentially allows male pupae (the first sex or the selected sex) to pass through the sieving apparatus and retains female pupae (the target sex to remove). In one embodiment the sieve has a slotted base structure with perpendicular struts 24 and the slots 22 are angled to form a groove (e.g., a regular trapezoid cross sectional profile or a curved cross-sectional profile) to enhance mechanical action. For example, male mosquito pupae are smaller than female pupae, so using slots with a grooved, trapezoidal or curved cross sectional profile allows male pupae to align with (roll into) and pass through the slot, whilst retaining the larger female. Figure 6B is a representation of a sieve apparatus 20 in one embodiment. In this embodiment the slots 22 are formed between elongated struts 26 each with a regular trapezoidal profile such that the slots also have a regular trapezoidal profile with the lower length smaller than the upper length. In one embodiment the apertures are cut using a router to provide a curved profile. In one embodiment a sieve was constructed from aluminium with a slot gap of 0.9mm. In a series of trials with this router cut aluminium sieve in which pupae were sieved on days 9-12 post hatching, a female contamination rate (that is, the proportion of females passing through the sieve) of between 0 and 4.5% was achieved (with an average of 2.7%). Larvae that do not reach pupation may be removed manually or passively through a second sieve system with a finer slot gap (to allow larvae, but not pupae, through). Larvae exhibit phototaxis and thus a lighting system may be used to repel the larvae through the second sieve system. The male wastage rate, which is the proportion of males that did not move through the sieve system and were classified as female, varied between 18.5% and 45.5% (with an average of 27%). It was noted that the female contamination rate was inversely related to the male wastage rate. Thus, a sieving system can obtain a mosquito population with around 95-99% males and 1-5% females. The sensitivity, specificity and accuracy ranges were 0.91-0.99, 0.91-0.98 and 0.82-0.83 respectively. Sensitivity measures the proportion of positives that are correctly identified; in this case it is the proportion of males correctly sorted through the sieves. Specificity measures the proportion of negatives that are correctly identified; in this case it is the proportion of females correctly sorted by the sieve. Accuracy combines both the sensitivity and specificity into one metric.
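
To make these metrics concrete, the sketch below computes sensitivity, specificity and accuracy from raw sieve counts, treating males that pass the sieve as true positives. The example counts are illustrative only and are not the trial data.

```python
def sieve_metrics(male_passed, male_retained, female_retained, female_passed):
    """Sensitivity, specificity and accuracy for a male-selecting sieve.

    Males that pass the sieve are true positives, females that are retained
    are true negatives, females that pass are contamination (false
    positives) and males that are retained are wastage (false negatives)."""
    tp, fn = male_passed, male_retained
    tn, fp = female_retained, female_passed
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    return sensitivity, specificity, accuracy

# Illustrative counts only (not trial data): ~27% wastage, ~2.7% contamination.
print(sieve_metrics(male_passed=730, male_retained=270,
                    female_retained=973, female_passed=27))
# -> (0.73, 0.973, 0.8515)
```
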
[0063] After sieving, the pupae passing through the sieve slot gaps are collected in a collection (or emergence) container 30, where they may rest to allow adult mosquitoes to emerge before entering a behavioural sorting module 40. In one embodiment a permanent connection may be provided between the collection container and the first behavioural sorting chamber 42 to allow emerging mosquitoes to fly into the behavioural sorting chamber 42. In another embodiment the pupae are left in the collection container 30 for a predefined time period, such as 1-2 days, and then any adult mosquitoes are placed in the behavioural sorting chamber 42. This may be achieved by removing a barrier between the chambers and/or using an air flow system to push or suck mosquitoes into the first behavioural sorting chamber 42 from the collection container. In one embodiment an opening or barrier between the collection container 30 and the behavioural sorting chamber 42 may be periodically opened, such as every six hours, and an air flow system used to push or suck emergent mosquitoes into the first behavioural sorting chamber 42 from the collection container.

[0064] The behavioural sorting module 40 comprises one or more behavioural sorting chambers 42 each having an exit aperture 52. In one embodiment (as illustrated in Figure 1A) a single behavioural sorting chamber is used. However, in other embodiments multiple behavioural sorting chambers may be arranged as a sequence or series to progressively reduce the female contamination rate. The behavioural sorting chamber(s) comprise one or more lures and an air flow system 91 configured (or controlled) to direct air into or out of the exit chamber of each sorting chamber based on a type of lure to lure arthropods of the non-target sex through the one or more sequential sorting chambers. Various lures may be used including audio, visual, and chemical attractant lures (including chemical attractant lures for both sexes). In some embodiments a swarm marker may also be included in the chamber and integrated into the exit aperture.

[0065] In this embodiment, each behavioural sorting chamber 42 comprises a swarm marker structure 50, an audio lure 44, and an air flow system 91. In the wild, male mosquitoes tend to swarm (clump or aggregate) over specific landmarks known as swarm markers. These typically comprise a central dark coloured object surrounded by a contrasting light coloured area. The male mosquitoes continuously fly around the dark object, and female mosquitoes that fly into or near the swarm are mated with. The swarm marker 50 comprises a first colour portion 54 comprising an exit aperture 52 and a second colour portion 56 surrounding the first colour portion wherein the second colour is a contrasting colour to the first colour. An embodiment of a swarm marker structure is further illustrated in Figure 7. In the embodiment shown in Figures 1A and 7 the swarm marker is a tubular structure with a black colour (although it could be another dark colour such as red, dark grey or navy blue) that projects into the sorting chamber from a base wall of the sorting chamber 42. The base structure 56 is an annular disk embedded in or resting on the base wall and has a white colour, although other light colours such as off-white, cream, or light yellow could be used. The exit aperture 52 is located at a distal end of the tubular structure 54 with respect to the base wall of the sorting chamber 42. The tube 54 connects the exit aperture 52 to an aperture 53 (dashed white circle in Figure 7) in the base wall of the sorting chamber 42. The exit aperture 52 of the swarm marker 50 thus acts as an exit aperture for the behavioural sorting chamber 42. In one embodiment the swarm marker 50 is located in the centre of the base of the sorting chamber 42.

[0066] Additionally, an audio lure 44 may be located adjacent the exit aperture of the swarm marker structure. In one embodiment the audio lure is a speaker which plays an audio signal at a frequency or frequency range determined from a wingbeat frequency range or wingbeat tone of the female mosquito (target sex) 45. A collection tube is located below and connected to the base of the swarm marker and a suction system is configured to draw air from the sorting chamber through the exit aperture (swarm marker) and collection tube 58. In one embodiment a bladeless fan 92, which is part of the air flow system 91, is embedded or located in the walls of the collection tube 58 just below the base of the swarm marker 56. Thus, as males fly around the swarm marker they come within range of the audio lure and fly towards the speaker 44 (the source of the sound 45). Whilst the audio lure is operating, the air flow system is configured to suck air through the exit aperture (e.g., a negative pressure air flow) so as to pull/suck the male mosquitoes (non-target arthropods) through the exit aperture of the swarm marker 50 and into the collection tube. Once the audio lure is switched off, the air flow is changed such that the air flow system provides zero or positive air flow into the exit aperture to prevent the lured mosquitoes exiting the sorting chamber. For example, fan 92 in the walls of the collection tube 58 may be switched on when the audio lure 44 is switched on. The air flow system 91 and audio lure may be configured to switch state at the same time, or a delay may be introduced such that the air flow system starts, or switches state (e.g., turns fan 92 on), a short time (e.g., 1-2 seconds) after the audio lure is started. The air flow system may change state (e.g., turn fan 92 off) at the same time the audio lure switches off, or a short time after the audio lure switches off. For example, the delay could shift both start and stop times by the same amount. In some embodiments alternative arrangements could be used. For example, the air flow system could be configured to create zero or positive air flow when the audio lure is on in order to allow a swarm to form around the swarm marker and the audio lure, and when the audio lure is switched off, or just prior to the audio lure switching off, the air flow system is configured to create negative air flow to suck the swarmed mosquitoes through the exit aperture. A computer (microprocessor 60') may be configured to repeatedly switch on the audio lure 44 for a short time period (which we will define as a lure time period) to lure the male mosquitoes and to control the air flow system 92.

[0067] The air flow system 91 comprises one or more fans 92, a blower 94, and a controller such as microprocessor 60' configured to switch the fans on or off and control the fan speed. The air flow system may also comprise connective tubing such as collection tube 58 which connects the sorting chamber 42 to target chamber 72 and final collection chamber 98. In other embodiments alternatives to fans, such as pump systems, could be used to create controlled air flows. In one embodiment the audio lure 44 and a first bladeless fan are synchronised so that the first fan is switched on for the lure time period to create a negative air flow in the chamber (to draw lured mosquitoes out of the chamber through the exit aperture) and then the first fan is switched off together with the audio lure for a second time period. A second fan 92 adjacent the target chamber may also be used to create a negative pressure air flow from the connecting tube 58 into the target chamber 72. Note that when the first fan is creating negative airflow with respect to the exit aperture of the sorting chamber 42, this air flow is positive with respect to the second fan. In some embodiments the operation of the first fan and second fan are synchronised and both spin in the same direction to draw arthropods through the connecting tube 58. In other embodiments when the first fan is switched off the second fan is switched on to draw arthropods through the connecting tube 58 towards the target chamber 72. In some embodiments the second fan is continuously switched on, with the speed of the second fan varied based on the state of the first fan, for example speeding up when the first fan is off. In one embodiment the lure time period and second time period are both 10 seconds. In other embodiments the lure time period and second time period are different, with the lure time period being shorter than the second time period. The lure time period may be a short time period, such as less than 30 seconds. This allows the mosquitoes to be pulled through the system in cohorts or batches. The second fan may alternate operation with the first fan or may operate continuously with the first fan, wherein the first fan is of sufficient power to override any negative airflow created by the second fan. In other embodiments the first and second fans could be combined, with the fan switching between creating positive air flow when the audio lure is off, and then reversing direction to create a negative air flow when the audio lure is on. In another embodiment a single fan at the base of the exit aperture is switched off (no air flow) when the audio lure is off, and switched on to create a negative air flow through the exit aperture when the audio lure is on. In one embodiment the computing apparatus 60 is a dedicated microprocessor board 60' such as an Arduino microprocessor board comprising processor 62 and memory 63, and is configured to control the audio lure and the air flow system, including controlling the speed and timing of individual fans 92 to create desired air flows (e.g., direction and pressure). The Arduino microprocessor may be programmed or configured to provide synchronised control of the audio lure and fan. Controlling the fan may include setting a fan speed, such as by controlling a voltage level or using pulse width modulation (PWM), and controlling a delay between switching the state of the audio lure and the state of the fan 92. The microcontroller may control ramp up/ramp down to a desired fan speed.
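
A minimal sketch of this synchronised lure/fan duty cycle is shown below, written in Python for illustration (the described embodiment uses an Arduino microcontroller). The `lure` and `fan` driver objects and the exact delay value are assumptions, not details from the disclosure.

```python
import time

LURE_PERIOD = 10.0   # seconds the audio lure is on (one embodiment)
REST_PERIOD = 10.0   # seconds the lure is off (the "second time period")
FAN_DELAY = 1.5      # assumed lag between lure and fan switching (1-2 s in text)

def run_sorting_cycles(lure, fan, cycles):
    """Duty-cycle the audio lure and the suction fan so that cohorts of
    lured males are drawn through the exit aperture in batches.

    `lure` and `fan` are hypothetical driver objects exposing on()/off()
    and set_speed(); the delay shifts the fan's start and stop by the same
    amount relative to the lure, as described in the text."""
    for _ in range(cycles):
        lure.on()                      # start broadcasting the wingbeat tone
        time.sleep(FAN_DELAY)
        fan.set_speed(1.0)             # negative air flow at the exit aperture
        time.sleep(LURE_PERIOD - FAN_DELAY)
        lure.off()                     # end of the lure time period
        time.sleep(FAN_DELAY)
        fan.set_speed(0.0)             # zero flow: retain remaining mosquitoes
        time.sleep(REST_PERIOD - FAN_DELAY)
```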

[0068] It is further noted that female mosquitoes tend to rest on, or fly high and near, the side walls of cages to stay away from the swarm marker 50 and avoid harassment by male mosquitoes. Thus, in one embodiment a female chemical attractant lure such as a blood feeding pad or heat pad is located on the walls of the behavioural sorting chamber 42. In some embodiments the blood feeding pad may be spiked with an insecticide to kill females. Other forms of lures include visual lures and male attractant lures, and multiple lures or combinations of lures may be used, such as an audio lure near the exit aperture and a second blood feeding pad (female chemical attractant) lure on the walls, or audio and visual lures. Visual lures include light of a specific frequency or frequency range (for example by using a light source and band pass filter) and/or lures of specific shapes including visual contrast patterns. In one embodiment a male chemical attractant is placed in the final collection chamber 98 and the air flow system 91 is configured to gently blow the attractant through the system and thus create a positive air flow through the target chamber 72 and into the exit aperture 52 of each sorting chamber 42.

[0069] Testing of a single stage behavioural sorting module 40 resulted in a mean sensitivity of 0.99, a mean specificity of 0.68 and a mean accuracy of 0.84. Sensitivity is considered the most important criterion for this stage, and these results indicate that behavioural sorting can effectively sort mosquitoes. In some embodiments the behavioural sorting module 40 may use multiple behavioural sorting chambers 42 connected in series by connection tubes 58. Using multiple chained behavioural sorting chambers can thus further reduce the female contamination rate. Further, the module requires only enough power to run the microcontroller 60 and suction/fan system 92, making it suitable for use in areas and countries without access to reliable energy sources. For example, a 12V battery source can provide adequate power to run the behavioural sorting module 40.

[0070] After passing through the behavioural sorting module 40, the mosquitoes are passed to a computer vision tracking and targeting module 70. This comprises a targeting chamber 72 comprising an entry aperture 73 and an exit aperture 74, and an air flow system 91 configured to direct air flow through the chamber from the entry aperture 73 to the exit aperture 74. The air flow system 91 may include bladeless fans 92 located in connection tubes 58 prior to the entry aperture and after the exit aperture 74, as well as a blower 94 for removing immobilized mosquitoes from the system. The air flow system may be controlled by the dedicated microprocessor board 60'. Bladed fans and other suction or blower systems may be used to create directed, controlled air flow through the system. Bladeless fans have an advantage over bladed fans as they reduce the likelihood of damaging mosquitoes flying through the system.

[0071] The computer vision tracking and targeting module 70 further comprises at least two image sensors 76, 78 which are spaced apart, each having a respective field of view 77, 79 that shares an overlapping region including at least a portion of the targeting chamber 72, or a single image sensor 76 and a mirror arrangement 76' configured such that a field of view 77 of the single image sensor is split into two portions where each portion has a respective field of view that shares an overlapping region including at least a portion of the targeting chamber 72. The at least two image sensors, or single image sensor and mirror arrangement, are configured to generate a stream of video images. Each output image from the single image sensor may be split to generate two images equivalent to the two images that would have been collected from the two image sensors 76, 78. In some embodiments the image sensors are cameras comprising one or more image sensors and associated optical assemblies, including video cameras capable of capturing 4K video, and may be configured to capture colour or monochrome images. A range of image sensors may be used, including CCD and CMOS based image sensors with a range of optical assemblies, filters and sensor sizes (e.g., number of pixels). The image sensor may be an integrated camera system such as a smart phone camera or digital video camera, and may comprise multiple image sensors each with an optical assembly, such that each image sensor has a different field of view, magnification range, and/or colour sensitivity. In one embodiment a 4K Blackfly camera may be used.

[0072] The two image sensors 76, 78 are spaced apart and their fields of view share an overlapping region including at least a portion of the targeting chamber 72 (i.e., an overlapping field of view). In one embodiment the targeting chamber 72 is a rectangular box, with a first camera (including a first image sensor 76) located in front of (and looking into) the targeting chamber 72 with a first field of view 77, and a second camera (including a second image sensor 78) located above (and looking down into) the targeting chamber 72 with a second field of view 79. In this embodiment the two cameras are orientated substantially orthogonally (e.g., the angle between the two vectors is within 5° of 90°) so that the intersection of the fields of view 77, 79 includes the entire targeting chamber 72. In other embodiments additional cameras could be added to provide additional fields of view of the targeting chamber 72. In other embodiments, such as illustrated in Figure 11, the computer vision tracking and targeting module 70 may comprise a single image sensor 76 (e.g., a single camera) and a mirror 76' configured (or located) such that the field of view 77 of the image sensor 76 is split into two portions where each portion has a respective field of view that shares an overlapping region including at least a portion of the targeting chamber 72. In this embodiment the second image sensor 78 is replaced with a mirror 76' to provide the single image sensor with a view through the top panel 72c of the chamber 72 in addition to a view of the front panel 72d. The field of view of the camera 77 thus simultaneously views the front panel 72d in a lower portion of the field of view and the top panel 72c in an upper portion (the upper and lower portions could alternatively be referred to as first and second portions). The use of a mirror to replace the second camera 78 allows a reduction in the height of the housing 88 and may reduce the cost of the camera system, and/or allow the use of a single image sensor or camera with better performance characteristics (such as higher quality lenses, resolution, dynamic range, capture and download speeds, etc.) compared to systems with two image sensors or cameras.

[0073] The computer vision tracking and targeting module 70 also includes a directed energy system comprising a directed energy source 80 and a targeting apparatus 82 for directing the directed energy beam 84 at a target position 86 within the targeting chamber 72 and a computing apparatus 60 comprising at least one processor 61 and at least one memory 62. The at least one memory 62 comprises instructions for configuring the at least one processor 61 to perform a computer vision tracking and targeting method for sex based sorting of arthropods.

[0074] A flow chart of an embodiment of this method is further illustrated in Figure 1B. Figure 2 is a schematic diagram of an embodiment of a computer vision tracking and targeting module 70 for sex based sorting of arthropods implemented in the system shown in Figure 1A. At step 104 the computer vision tracking and targeting module 70 is configured to capture a stream of video images either from each of at least two image sensors 76, 78 or from the single image sensor 76 (e.g., a single camera) and a mirror 76'. As noted above, the at least two image sensors are spaced apart and share an overlapping region including at least a portion of a targeting chamber 72, or the single image sensor and mirror arrangement are configured such that a field of view of the single image sensor is split into two portions where each portion has a respective field of view that shares an overlapping region including at least a portion of the targeting chamber, through which multiple arthropods (mosquitoes) move from the entry aperture 73 to the exit aperture 74 under the assistance of an air flow system that directs air flow through the chamber from the entry aperture to the exit aperture. Each portion comprises a different set of pixels in the image, and thus the output image from the single image sensor may be split to generate two images equivalent to the two images that would have been collected from the two image sensors 76, 78, or the system may be configured to store the respective pixel regions of the image sensor corresponding to the two fields of view such that each region can be processed separately, i.e. as separate images, as would be the case where two image sensors were used.

[0075] The tracking and targeting method further comprises tracking a three dimensional position of one or more arthropods using the stream of video images from each of the at least two image sensors 105, and determining a sex of an arthropod by identifying the arthropod in at least one image in the stream of video images and using a trained machine learning classifier to estimate the sex of the arthropod 106. The method proceeds to predict a target position at a future time of an arthropod classified as a target sex (e.g. female) using the tracked three dimensional positions of the arthropod 107, and to direct a directed energy beam 84 at the target position 86 at the future time to kill or incapacitate (e.g. maim/sterilise/immobilize) the target arthropod (female mosquito).

[0076] In this embodiment the directed energy source 80 is a laser, the directed energy beam 84 is a laser beam and the targeting apparatus 82 is a mirror galvanometer system. A laser enclosure 88 is located around the directed energy system 80, cameras 76, 78 and targeting chamber 72 to safely enclose and shield the system. The orientation of the mirror galvanometer 82 is controlled by the computing apparatus 60 to direct the laser beam 84 at the predicted target position 86 within the targeting chamber at the future time. However, in other embodiments other directed energy sources such as radio frequency sources, including microwave and terahertz sources, and even X-ray sources may be used. In one embodiment the laser is a 15 Watt laser with a 1x3 mm beam spot. However, in other embodiments higher powered lasers such as a 35 W laser with a 6x6 mm spot can be used. The laser energy can also be controlled using a PWM controller.

[0077] The sex sorted arthropods are then collected 109 in a collection chamber 98 which is connected to the exit aperture 74 of the targeting chamber 72 via a connecting tube to collect arthropods of the sorted (i.e., non-targeted) sex after traversing the targeting chamber 72. The connecting tube may include a female (or target sex) collection chamber to collect immobilised (killed) mosquitoes hit by the directed energy beam.

[0078] Figure 2 is a schematic diagram of the computer vision tracking and targeting method for sex based sorting of arthropods implemented by the computing apparatus 60 according to an embodiment. In one embodiment the computer vision tracking and targeting method comprises a three dimensional realtime tracking module 320 and an Artificial Intelligence/Machine Learning (AI/ML) classification module 210 which process the incoming stream of images in parallel. When a female mosquito is identified, its location is provided to a targeting module 230.

[0079] The AI/ML classification module 210 processes a classification image stream and the 3D realtime tracking module processes a tracking image stream. Each of the classification image stream and the tracking image stream is generated from at least two image sensors each in different locations, and thus each stream contains sub-streams from each image sensor; or, in the case of a single image sensor and a mirror 76', each of the classification image stream and the tracking image stream is generated from each portion of the image corresponding to the two fields of view, and thus each stream contains sub-streams from the two portions of the image sensor. The output of an image sensor may be split to generate a tracking stream and a classification stream, or additional image sensors may be used (e.g. four image sensors in total) with each dedicated to generating a stream of images for either the tracking stream or the classification stream from the respective location of the image sensor (and associated field of view). In the embodiment illustrated in Figure 3C, two image sensors (e.g. cameras) are located in two different locations so each has a different (orthogonal or near orthogonal) field of view, and the incoming image stream from each image sensor is split into two image streams, one of which is passed to the classification image stream and the other to the tracking image stream. That is, a tracking stream of images is generated for each of the image sensors (e.g., cameras) and each of the tracking streams are synchronised. Further, each image is down converted to a predefined lower resolution image size. In one embodiment the incoming stream of images are 4K images (3840x2160 pixels) which are down converted to 1024 by 768 pixel images (e.g., XGA resolution) for tracking. Similarly, a classification image stream is generated for each of the image sensors (or cameras). This second image stream (the classification stream of images) comprises the input stream of full frame 4K images from the image sensors/cameras. The tracking of the three dimensional position is performed using the tracking stream of images and determining a sex of an arthropod is performed using the classification stream of (2D) images, and the steps of tracking a three dimensional position and determining a sex of an arthropod are performed in parallel. Other image resolutions could also be used, with the choice dependent upon the available processing power and processing requirements of each stream. In the embodiment shown in Figure 3C, two image sensors are used and the output image stream of each image sensor is split to create the two image streams (classification and tracking) for that location. In another embodiment additional image sensors are used to generate independent image streams to avoid splitting of output image streams. For example, in one embodiment, four image sensors are used and are divided into two sets of two image sensors. The first set is used only to generate the tracking image stream and the second set is used only for generating the classification image stream. In one embodiment image sensors are paired such that an image sensor from the first set and an image sensor from the second set are co-located, with each pair at a different location. The images used in the tracking stream are of lower resolution than the images used for the classification stream, and thus different types of image sensors can be used for the respective streams.
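As an illustrative sketch only (using the OpenCV Python bindings; the capture source and resolutions are assumptions based on the 4K/XGA figures above), one way to split a single sensor's output into the two streams is:

```python
import cv2

TRACK_W, TRACK_H = 1024, 768   # down-converted tracking resolution (XGA)

def split_streams(capture: cv2.VideoCapture):
    """Yield (tracking_frame, classification_frame) pairs from one sensor:
    the full frame feeds the classification stream and a down-converted
    copy feeds the tracking stream."""
    while True:
        ok, frame = capture.read()          # e.g., a 3840x2160 full frame
        if not ok:
            break
        tracking = cv2.resize(frame, (TRACK_W, TRACK_H),
                              interpolation=cv2.INTER_AREA)
        yield tracking, frame
```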
In some embodiments the image sensors used for generating the tracking stream are of lower native resolution, or operate at a lower resolution, than the image sensors used for generating the classification stream. In embodiments with three or more image sensors, some of the image sensors generate single image streams (i.e., for either the classification image stream or the tracking image stream) and other image sensors generate image streams which are split into two image streams (i.e., one for the classification image stream and one for the tracking image stream). In embodiments with a single image sensor and mirror, each portion of the image, corresponding to the respective fields of view, is split into a classification image stream and a tracking image stream (i.e., a single image sensor generates four sub-streams of images).

[0080] To track in three dimensions, we first process two dimensional images in the tracking image stream by subtracting video frames from one another. In one embodiment we generate a first subtracted image captured at a first time by comparing a first image at the first time with at least the previous frame from the tracking stream of images (from the same image sensor). In one embodiment we subtract the previous image from the first image to eliminate stationary pixels (or mosquitoes). In other embodiments several previous images could be combined to use as the image to be subtracted. Similarly, we generate a second subtracted image captured at the first time by comparing a second image at the first time with at least the previous frame from the second tracking stream of images.

[0081] We then identify one or more arthropods in the first subtracted image and estimate a first position for each identified arthropod and identify one or more arthropods in the second subtracted image and estimate a second position for each identified arthropod.

[0082] This is illustrated in Figure 4A which shows a tracking stream of images from a first camera 410 in which previous image 412 is subtracted from current image 413 to generate subtracted image 420. An embodiment of a subtracted image 420 is illustrated. Mosquitoes 422 which have moved since the previous frame stand out in the subtracted image. An object detector algorithm may be run over the subtracted image to generate a bounding box enclosing an arthropod whose position we wish to estimate. The object detector may be a machine learning based object detector trained on images of mosquitoes and may be implemented using a computer vision library such as OpenCV or TensorFlow.

[0083] Tracking of the three dimensional position of one or more arthropods is based on a ray tracing approach. We determine the three dimensional position for each identified arthropod through ray tracing, which uses the position of the first sensor and the first position to generate a first ray, and the position of the second sensor and the second position to generate a second ray. Each position may be an estimate of a centroid of the arthropod within the bounding box. The position may also be determined using pattern matching, for example to identify the thorax or abdomen of the mosquito. The three dimensional position is determined by identifying an intersection point of the first ray and the second ray within the targeting chamber.
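A minimal sketch of the frame subtraction and detection step described above, assuming grayscale frames and OpenCV's standard thresholding and contour functions (the threshold and minimum area values are illustrative assumptions, not values from this disclosure):

```python
import cv2

def detect_moving(prev_gray, curr_gray, diff_threshold=25, min_area=4):
    """Subtract the previous frame from the current frame and return
    bounding boxes (x, y, w, h) around pixel regions that moved, i.e.
    candidate mosquitoes; stationary pixels cancel out."""
    diff = cv2.absdiff(curr_gray, prev_gray)
    _, mask = cv2.threshold(diff, diff_threshold, 255, cv2.THRESH_BINARY)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    return [cv2.boundingRect(c) for c in contours
            if cv2.contourArea(c) >= min_area]
```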

[0084] This is further illustrated in Figures 3A to 3C. Figure 3A is an exploded view of the targeting chamber 72 illustrating a lighting system according to an embodiment. In this embodiment the lighting system comprises a pair of translucent walls 302, 304 which are backlit with lighting panels 312, 314. The translucent walls are provided on each side of the targeting chamber 72 distal from each camera to backlight the targeting chamber 72 with respect to the camera. In this embodiment a first lighting panel 312 is located behind the target chamber 72 to backlight the view of the target chamber from the first camera 76 located in front of the target chamber 72, and a second lighting panel 314 is located below (and supporting) the target chamber to backlight the view of the target chamber from the second camera 78 located above the target chamber 72. Backlighting may be provided by LED strips mounted to a panel, PCB mounted LEDs, a commercial off the shelf light box, or an LCD backlight panel. In one embodiment each lighting panel comprises an array of 600 LED lights generating 4000-5000 lumens of light. Figure 3B shows a pair of images 320, 321 captured by a pair of cameras with approximately orthogonal views of the targeting chamber according to an embodiment. Moving mosquitoes 322, 323 and 324 are identified by bounding boxes.

[0085] Figure 3C is a schematic view illustrating the relative geometry of each camera with respect to the targeting chamber and the ray traces used for estimating positions of mosquitoes, according to an embodiment. First camera 76 has a field of view 77 and projects a first ray 341. Second camera 78 has a second field of view 79 and projects a second ray 342. The first ray 341 and second ray 342 intersect at point 343. The relative geometry of the cameras, their pointing directions and the location and dimensions of the targeting chamber are stored in a memory 63 of a computing apparatus 60 associated with the computer vision tracking and targeting module 70. This allows the three dimensional position of the mosquito to be triangulated and estimated. In this embodiment the cameras are located orthogonally with respect to each other. However, in other embodiments the two cameras could be in non-orthogonal geometries, including side by side placement, provided the positions, pointing directions and fields of view of each camera are known. Additional cameras could also be provided to generate additional rays to be used in triangulating the position.

[0086] In some embodiments there may be no exact intersection, and an intersection point may be determined based on assuming an error radius or error threshold. In this embodiment, if any two rays are within a threshold distance of each other they are deemed to intersect. In another embodiment each ray may be assigned a volume. For example, the ray may be the axis of a cylinder with a fixed radius, or the axis of a cone whose walls are based on an angular error radius to create a solid angle. These approaches allow intersection volumes to be determined, and intersection points may be a centroid or mid-point of the intersection volumes.
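The intersection test with an error threshold can be sketched as the mid-point of closest approach between the two camera rays (a standard construction; the 5 mm gap threshold below is an illustrative assumption):

```python
import numpy as np

def ray_intersection(o1, d1, o2, d2, max_gap=5.0):
    """Return the mid-point of closest approach of two rays (origin o,
    direction d), or None when the rays pass further apart than max_gap,
    i.e. when they are not deemed to intersect."""
    d1 = d1 / np.linalg.norm(d1)
    d2 = d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    b, d, e = d1 @ d2, d1 @ w0, d2 @ w0
    denom = 1.0 - b * b                 # zero when the rays are parallel
    if abs(denom) < 1e-9:
        return None
    s = (b * e - d) / denom             # parameter along the first ray
    t = (e - b * d) / denom             # parameter along the second ray
    p1, p2 = o1 + s * d1, o2 + t * d2   # closest points on each ray
    if np.linalg.norm(p1 - p2) > max_gap:
        return None
    return (p1 + p2) / 2.0
```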

[0087] Once a three dimensional position has been estimated we next determine if the identified arthropod exists in a tracking database (that is, was it identified in a previous image). This may be performed by comparing the position with a previous position or an estimate of a future position of a previously identified arthropod. Image comparison methods may also be used, for example by comparing the image with a previous image. If the identified arthropod exists in the tracking database we obtain its identifier, otherwise we register a new identifier for the identified arthropod. We also store information used to identify the arthropod for future comparisons/searches. We then store the three dimensional position of the arthropod, and the associated order (e.g., an index), time and arthropod identifier in the tracking database.
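A toy sketch of this tracking database logic, matching by proximity to the last stored position (the match radius and data layout are illustrative assumptions):

```python
import itertools
import numpy as np

class TrackDB:
    """Match a new 3D position to an existing track, or register a new ID."""

    def __init__(self, match_radius=15.0):
        self._ids = itertools.count(1)
        self.tracks = {}                  # id -> list of (index, time, position)
        self.match_radius = match_radius

    def update(self, position, time):
        position = np.asarray(position, dtype=float)
        best, best_dist = None, self.match_radius
        for tid, history in self.tracks.items():
            dist = np.linalg.norm(np.asarray(history[-1][2]) - position)
            if dist < best_dist:          # closest previously seen arthropod
                best, best_dist = tid, dist
        if best is None:
            best = next(self._ids)        # register a new identifier
            self.tracks[best] = []
        self.tracks[best].append((len(self.tracks[best]), time, position))
        return best
```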

[0088] In parallel with the tracking process, we also determine a sex of an arthropod. This is performed by first identifying one or more arthropods in each image in the classification stream of images. We generate a candidate arthropod image by cropping the original image to a standard (i.e., fixed or predetermined) classification image size around an identified arthropod, for example to a bounding box with a predetermined size which contains the identified arthropod. In one embodiment this is a 160x160 pixel image. However, in other embodiments other sizes may be used depending upon the processing capabilities available. Multiple candidate arthropod images may be generated from the original image. Each candidate arthropod image is provided to a trained machine learning classifier to obtain an estimate of the sex of the arthropod. If the estimated sex matches a target sex (e.g., female) we then obtain the identifier of the arthropod from the tracking database. This can be performed based on the position of the arthropod in the original image from which it was cropped (in image coordinates) by searching the tracking database (generated by the tracking image stream). For example, a record in the tracking database may store the identifier, three dimensional position, and the original position (in image coordinates) of the mosquito in each image used to generate the three dimensional position. A search of the database could be performed for a record with original image position(s) that match the current original image position. Alternatively, a coarse 3D position could be estimated using ray tracing and matched to a position in the tracking database.
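A sketch of this crop-and-classify step, assuming a Keras-style model object with a predict method and the 160x160 crop size noted above (the normalisation and boundary handling are illustrative):

```python
import numpy as np

CROP = 160   # standard classification image size (one embodiment)

def classify_candidate(frame, cx, cy, model, threshold=0.01):
    """Crop a fixed window centred on a detection and query the trained
    classifier; returns True when the score exceeds the female threshold."""
    h, w = frame.shape[:2]
    x0 = int(np.clip(cx - CROP // 2, 0, w - CROP))
    y0 = int(np.clip(cy - CROP // 2, 0, h - CROP))
    crop = frame[y0:y0 + CROP, x0:x0 + CROP].astype("float32") / 255.0
    score = float(model.predict(crop[np.newaxis], verbose=0)[0, 0])
    return score > threshold   # low thresholds minimise female contamination
```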

[0089] Once a female mosquito (or target sex arthropod) is identified the targeting module 230 continues to track the position of the target mosquito. This may comprise analysing the previous set of positions using a Kalman filter to predict the future position at a future time and then firing the laser at the future time to incapacitate the mosquito. The future time takes into account processing delays and time taken to orient the galvanometer mirror to target the mosquito with the laser pulse. This is further illustrated in Figure 5D. In this embodiment frames are captured every 33ms and processing of images and estimation of the future position requires 13ms. The future time is thus selected to be at most the sum of 13ms and 33ms, such as 46ms after the current frame. In another embodiment, this future time is reduced through the use of monochrome cameras which allow faster image processing resulting in an 8ms delay between frames.
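As a simplified stand-in for the Kalman filter prediction (a constant-velocity extrapolation rather than a full filter; the 46 ms lead time follows the frame and processing delays given above):

```python
import numpy as np

def predict_target(positions, times, lead_time=0.046):
    """Extrapolate the target's 3D position lead_time seconds ahead
    (e.g. 46 ms = 33 ms frame interval + 13 ms processing) from the two
    most recent tracked positions."""
    p = np.asarray(positions, dtype=float)
    t = np.asarray(times, dtype=float)
    velocity = (p[-1] - p[-2]) / (t[-1] - t[-2])   # mm per second
    return p[-1] + velocity * lead_time
```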

[0090] In this embodiment each camera observes the full chamber. However, in other embodiments the cameras are only required to view an overlapping portion which can be targeted with a laser system. Multiple camera and laser systems could be provided, either for use with the same target chamber or as multiple modules provided in series.

[0091] Camera systems are selected to ensure they have a sufficient depth of field across the dimensions of the targeting chamber. In one embodiment a 4K (~8 Megapixel) camera operating at 30fps was used. Camera systems that generate 4K uncompressed image streams are preferred over compressed streams as this ensures video processing is as fast as possible. Testing also indicated that colour carried less than 5% of the information in the images, and thus monochrome cameras may be used. Monochrome image sensors/cameras often have the advantage of providing higher resolution and lower processing time than comparable colour image sensors/cameras.

[0092] In one embodiment the machine learning classifier is trained using an Xception architecture, although other architectures and algorithms may be used. Figure 4B is a schematic illustration of an Xception architecture and associated input, middle layer and output images according to an embodiment. The training system should be as similar as possible to the usage system. In the case of a change to the system (different dimensions or lighting) the system can be rapidly retrained in a few days. In one embodiment 1024-dimensional vectors were used rather than 2048. However, it is considered that either value would work well. All weights in the model were randomly initialised.

[0093] Images acquired from high definition video feeds of contained mosquitoes in a laboratory were used as training data. Videos of male and female mosquitoes were taken using containers of only male or only female mosquitoes so that hand labelling of images was not required. Two cameras were used to record the mosquitoes: (i) a camera facing the container from side-on; and (ii) a camera facing the container from above. Images of mosquitoes were extracted from the video recordings and saved as 160 by 160 pixel PNG files with red, green and blue (RGB) colour channels. Pixels in these images were easily converted to grayscale by combining their channel values using:

p_gray = 0.299 p_red + 0.587 p_green + 0.114 p_blue    (Equation 1)

where p_gray is the pixel value of the grayscale image, and p_red, p_green and p_blue are the pixel values from the three colour channels of the original image. In total, there were 263,326 images of males and 748,688 images of females taken from a top-down camera and 314,560 male and 853,297 female images from the side-on camera. 75% of these images were used for training and the remaining 25% for validation of the model.
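In Python (assuming the weights in Equation 1 are the standard luminance coefficients, as reconstructed above), the conversion can be vectorised over a whole image:

```python
import numpy as np

GRAY_WEIGHTS = np.array([0.299, 0.587, 0.114])   # assumed standard weights

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Apply Equation 1 to an HxWx3 RGB image, returning an HxW image."""
    return rgb @ GRAY_WEIGHTS
```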

[0094] We trained four deep neural network models with the objective of accurately classifying male and female mosquitoes from these images. Each of the models had an identical architecture but was trained on imagery acquired in slightly different ways, namely: (i) side-on camera and colour images; (ii) side-on camera and grayscale images; (iii) top-down camera and colour images; and (iv) top-down camera and grayscale images. Models were built and trained using Keras (v 2.2.5; Allaire and Chollet, 2019) for R (v 3.4.0; R Core Team, 2019). PNG files were imported into R using the "png" package (Urbanek, 2013).

[0095] The architecture employed for training the models was based on the Xception model (Chollet, 2017). The outputs of the Xception model were then fed into a layer that performed two-dimensional average pooling, then into a densely connected layer with 1024 nodes and a ReLU activation function, before finally being fed to a single output node with a sigmoid activation function.
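For illustration, a Python/Keras re-expression of this architecture (the original models were built with Keras for R; the layer choices below follow the description above, with random initialisation):

```python
from tensorflow import keras
from tensorflow.keras import layers

def build_classifier(input_shape=(160, 160, 3), dense_units=1024):
    """Xception backbone -> 2D average pooling -> dense ReLU layer ->
    single sigmoid output node (0 = male, 1 = female)."""
    base = keras.applications.Xception(include_top=False, weights=None,
                                       input_shape=input_shape)  # random init
    x = layers.GlobalAveragePooling2D()(base.output)
    x = layers.Dense(dense_units, activation="relu")(x)
    out = layers.Dense(1, activation="sigmoid")(x)
    return keras.Model(base.input, out)
```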

[0096] Training was undertaken using binary cross-entropy loss and the "RMSProp" algorithm. We randomly sampled batches of 32 images from the image pool, ensuring that 16 images were male and 16 were female. We assessed predictive accuracy every 100 batches, which we refer to as "checkpoints". We used a Keras "model checkpoint" callback which reduced the learning rate by a factor of two if the validation accuracy (assessed on another 100 batches from the validation set) had not improved over the past 20 checkpoints. Each model was trained for 24 hours using a single Nvidia Tesla P100 GPU, 16GB RAM and a single CPU.
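A sketch of the balanced-batch sampling and training setup in Python/Keras (again a re-expression of the R pipeline; the checkpoint and learning-rate callback logic is omitted for brevity):

```python
import numpy as np

def balanced_batches(male_imgs, female_imgs, batch_size=32):
    """Yield batches with exactly half male and half female images."""
    half = batch_size // 2
    while True:
        m = male_imgs[np.random.choice(len(male_imgs), half)]
        f = female_imgs[np.random.choice(len(female_imgs), half)]
        x = np.concatenate([m, f]).astype("float32") / 255.0
        y = np.concatenate([np.zeros(half), np.ones(half)])  # 0=male, 1=female
        yield x, y

model = build_classifier()   # from the sketch above
model.compile(optimizer="rmsprop", loss="binary_crossentropy",
              metrics=["accuracy"])
# model.fit(balanced_batches(males, females), steps_per_epoch=100, ...)
```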

[0097] Once the four models were trained, we used each of them to classify the 25% of the image data that had been held back for validation. Classification was performed using a range of classification thresholds (i.e., the value between 0 and 1, that the model’s output would need to exceed in order to be classified as a female). We could then ascertain the proportions of males and females that would be eliminated by each of our models. This allowed us to ascertain two important quantities: (i) the level of female contamination that could be expected from each classifier with each threshold; and (ii) the proportion of males that would be lost through misclassification as females.

[0098] Tables 1 - 4 outline the proportions of males and females that would be eliminated by each of the models developed under different classification thresholds. Each row of these tables is computed by applying the classifier to 187,172 female images and 78,640 male images. For each image, each classifier outputs a value between zero and one, where zero corresponds to male and one to female. The threshold column corresponds to where we draw the line in this interval when assigning an image to the male or female categories.
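This threshold sweep can be sketched as follows, given per-image classifier scores for the held-out male and female validation images (the function name and threshold values are illustrative):

```python
import numpy as np

def contamination_table(female_scores, male_scores,
                        thresholds=(0.01, 0.1, 0.5)):
    """For each threshold, report the fraction of females passed through
    (contamination) and the fraction of males eliminated (lost)."""
    female_scores = np.asarray(female_scores)
    male_scores = np.asarray(male_scores)
    rows = []
    for t in thresholds:
        females_passed = np.mean(female_scores <= t)   # not eliminated
        males_lost = np.mean(male_scores > t)          # wrongly eliminated
        rows.append((t, females_passed, males_lost))
    return rows
```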

[0099] The Xception architecture provides a highly accurate classifier of mosquito images extracted from video. Models set at a low threshold, such as 0.01, lead to a very low female contamination probability and also a very low proportion of males destroyed. It is also evident that there is much greater classification accuracy from models trained using a top mounted camera compared to a side mounted camera. The use of colour images and a top mounted camera resulted in a female contamination rate of 2.67 per 100,000 females, whilst for the side mounted camera with colour images, the contamination rate was 4.48 per 1,000 females.

[00100] The use of colour images for the top mounted camera helped to reduce the female contamination probability compared to grayscale images by almost an order of magnitude. The use of colour on the side mounted camera appeared to be less important, with grayscale images achieving about half the contamination of the colour images. We do note however that in the case of the side mounted images, predictive accuracy was far lower than for the top mounted camera and so if a choice was to be made about whether to use colour or not, it would appear that colour would be best. This trial was performed using only top down images and it is estimated that when using two orthogonal views a female contamination probability of around 1.2 per 10 million females could be achieved.

[00101] A simple experiment was performed to characterise the use of the above described tracking algorithm to target moving mosquitoes with a low-powered laser as a proof of principle exercise. A single mosquito was released into the flight chamber and flight patterns recorded. The laser was set to constantly track the mosquito and for each video frame the mosquito moved within the computer vision system, a targeting prediction was made, and the accuracy of how close this prediction was to the actual mosquito position was recorded. Mosquito target accuracy was visualised using the R libraries ‘plotly’ and ‘rgl’ and descriptive statistics were calculated. Finally, we estimated the accuracy of the laser target in relation to the speed of mosquito movement. The process for calculating targeting accuracy was as follows:

[00102] As discussed previously, calculations were made between each iteration of simultaneous 4K frames from the video stream of two synchronised orthogonal cameras (top and front).

[00103] The first measurement is the real position of the mosquito in three-dimensional space in the laser sorting chamber. As discussed above this was performed by detecting the mosquito centroids from the difference of sequential frames. The centroids of the same mosquito from each camera were then matched up using the ray-tracing method discussed above. With the correct matchup of the two centroids, the actual mosquito 3D location is calculated.

[00104] This 3D location of the mosquito is then matched to the database where locations of each mosquito were identified in the past.

[00105] We then predict the future position of each mosquito using a Kalman filter algorithm (Kalman, 1960).

[00106] The prediction of a mosquito’s future location is where the laser is targeted. It is converted to X and Y voltage for the galvanometric controller, which turns the two-axis mirror and redirects the laser beam to intercept the mosquito.

[00107] The targeting error presented here was calculated by measuring the difference in the next 3D position of the mosquito with that predicted by the laser targeting point.
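The targeting error of paragraph [00107] reduces to a Euclidean distance, sketched as:

```python
import numpy as np

def targeting_error(predicted, actual_next):
    """Distance (mm) between the laser targeting point and the mosquito's
    actual next 3D position."""
    return float(np.linalg.norm(np.asarray(predicted)
                                - np.asarray(actual_next)))
```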

[00108] Accuracy of this embodiment of the targeting algorithm varied depending on the speed and change in linear direction over which the mosquito flew within the flight chamber, i.e. its velocity. The prediction error (inaccuracy) followed a linear relationship with velocity (see Figure 5A, r² = 0.85, y = 0.9931x + 0.2124) with a mean error of 2.24mm (SD ± 3.47mm) and mean velocity of 2.01mm (Figure 5C, SD ± 3.36mm). Importantly, 79.9% of error measurements (14,394/17,999) fell within the focal length of the estimated future position of the laser spot, which was 3mm in this embodiment. Figure 5A is a visualisation of prediction error rates relative to mosquito movement within the target chamber, where darker gray colours (darker red colours in the drawing in the original priority application) represent an increase in error (inaccuracy) relative to the velocity of mosquito movement, measured as the difference in mm between two video frames at 33fps according to an embodiment. Each point in the flight path is assigned a colour corresponding to the error according to the legend on the side, with grey corresponding to zero error and red corresponding to an error of 40mm (i.e., an error range from 0-40mm).

[00109] A second test of the AI/ML tracking system was performed within the system. Males and females were hand sorted and released into the behavioural sorting box, where males attracted to a lure are then tracked and classified by the computer vision and neural network systems in real-time.

Individuals were given an ID number when they moved into the computer vision system, which was used to distinguish between adults. A decision value between 0 and 1 was produced for each adult, where 0 corresponds to male and 1 to female. A conservative decision threshold of 0.01 was set for differentiating males from females, to simulate what would occur during normal operation. For each classification step, an image crop was made of the adult mosquitoes being tracked and each crop was stored so that the accuracy of each classification could be checked by eye. We recorded the actual sex and whether the classification was accurate; a false discovery rate (male classified as female) and a false negative rate (female classified as male) were calculated alongside sensitivity, specificity and accuracy.
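The reported metrics follow the usual confusion-matrix definitions with "female" as the positive class; a sketch (the input counts are the caller's, not values from this disclosure):

```python
def classification_metrics(tp, tn, fp, fn):
    """Sensitivity, specificity, accuracy and false negative rate, where
    tp = females correctly classified, tn = males correctly classified,
    fp = males classified as female, fn = females classified as male."""
    sensitivity = tp / (tp + fn)
    specificity = tn / (tn + fp)
    accuracy = (tp + tn) / (tp + tn + fp + fn)
    false_negative_rate = fn / (tp + fn)
    return sensitivity, specificity, accuracy, false_negative_rate
```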

[00110] A total of 3,182 classifications were made on 604 individual adult mosquitoes. This was further broken down into 444 and 2,534 correct male and female classifications, respectively, and 204 incorrect classifications (Table 1). This resulted in a neural network sensitivity, specificity and accuracy of 0.97, 0.87 and 0.94, respectively. As we were primarily interested in females being misclassified, we observed a false negative rate of 0.03, with 66 classifications incorrectly labelled as males (Table 1).

TABLE 1

Accuracy of AI/ML classifier within the system.

[00111] To test the accuracy of laser targeting within the system, we ran a trial where all mosquitoes that flew in front of the cameras were targeted and an attempt to immobilize them was made. Neural network classification was disabled to ensure the optimal efficiency of the system. We recorded individual mosquito IDs, the duration of tracking and targeting, whether the mosquito escaped back to the direction it originated (escaped - yes/no) and whether it was immobilized or not (yes/no). Whole 8MP screen images were recorded at 33fps to determine the outcome of an immobilization attempt. A comment was made on outcomes if it was not clear what had occurred or if there was a partial hit.

[00112] During the first real-time trial of the laser immobilization system, we recorded a total of 156 identifications of which 50% (78/156) were immobilized. As mosquito flight is sometimes nonlinear, some mosquitoes entered but returned in the direction they originated from and did not cross the sorting container. These were further classified into returned immobilized (yes) and returned not immobilized (no; Table 2). Our mean time to immobilization was 0.72 seconds, and of those mosquitoes which escaped through the system, 8 were hit by the laser but not immobilized.

TABLE 2

Summary results of the first immobilisation trial.

[00113] A full test run of an embodiment of the system was performed using adult mosquitoes. Of the 2,636 total pupae sorted in four rounds of sieving, 1,227 males and 14 females were placed into the collection (emergence) chamber 30 to be sorted by the system. A total of 916 males (75%) and 4 females (30%) passed through the behavioural sorting module 40 within 180 minutes and into the laser targeting area 70. All four females that made it through the behavioural sorting module were identified and immobilized by the laser (i.e., a 100% success rate).

[00114] Based on trials of the system it is estimated that the system is capable of identifying, classifying and eliminating unwanted female mosquitoes at a sorting accuracy rate (or female contamination rate) of around 1 in 1 million. Multiple sequential behavioural sorting chambers can be used to improve the accuracy, along with additional image sensors or optimised image processing. Further, mosquitoes are relatively fast-moving arthropods and thus represent a significant stress test of the system. Positive results with mosquitoes suggest that the system operates at sufficient speed to process other (slower moving) arthropods.

[00115] Embodiments of the system enable efficient sorting of male from female mosquitoes (or arthropods) by using a series of behavioural and physical characteristics for identification and classification. The system is modular and comprises a sieve module to remove >95% of females from the cohort and a behavioural sorting module that attracts males to a "swarm marker" and plays a female wingbeat tone through a speaker to act as an audio lure, which is coordinated with an air flow system, such as a bladeless fan, to suck/pull males into the next sorting cage. The audio lure takes advantage of a difference in the wingbeat frequency of males and females. The exact frequency (or frequency range) for mosquitoes depends upon the species as well as environmental factors such as temperature and humidity but is typically in the range of 100Hz to 1000Hz. The specific frequency can thus be selected based on the specific species and environmental conditions. In one embodiment a female wing beat tone of 480Hz is used. This module achieved accuracies of around 99%. In other embodiments other lures including visual lures, chemical attractants and blood feeding pads may be used.

[00116] The individual modules comprise many advantageous features and may be provided as standalone units or modules, or systems may be provided as various combinations of the modules described above, including the embodiment shown in Figure 1 which both rears arthropods and then sex sorts the emergent arthropods. For example, in one embodiment the system may comprise a sieving module and a behavioural sorting module comprising multiple behavioural sorting chambers arranged in sequence with an air flow system. In this embodiment each sorting chamber multiplicatively reduces the proportion of the target sex. In some embodiments the system may comprise a behavioural sorting module 40 and a computer vision based tracking and targeting module 70 which, when combined, significantly improve performance compared to embodiments of the system with only one of the two modules.

[00117] In some embodiments the system 1 comprises an integrated sex sorting apparatus comprising the behavioural sorting module 40, the computer vision based tracking and targeting module 70 (including the computing apparatus 60 and directed energy module 80), and the final collection module 98 with integrated air flow system 91. In some embodiments, the sex sorting apparatus is an integrated apparatus as illustrated in Figures 8A and 8B. In this embodiment the behavioural sorting module 40 and the final collection module 98 are both connected to a housing 88 containing the computer vision based tracking and targeting module 70 (including the computing apparatus 60 and directed energy module 80) via an integrated air flow system comprised of tubing and fans 92. A container containing arthropods is connected to the behavioural sorting module to act as a supply source of arthropods to be sorted.

[00118] In this embodiment the walls of the housing 88 are shown as laser safe translucent walls to show internal components, but could be formed of opaque materials such as sheet metal. The entry aperture 73 and exit aperture 74 of the targeting chamber 72 are formed as tubular interface structures, for example as injection moulded parts, which project through corresponding apertures in the side walls of the housing for connection to the tubing of the air flow system. Tubular piping with an integrated fan 92 connects the exit aperture 52 of the behavioural sorting module 40 to the entry aperture 73 of the targeting chamber. Similarly, tubular piping connects the exit aperture 74 of the targeting chamber to the final collection module 98. In this embodiment the collection chamber 98 is a removable translucent or transparent chamber which is removably mounted to a base section of the tubular piping comprising a transparent or translucent section followed by a fan 92 section. A gate arrangement may be placed in the tube near the entrance of the collection chamber. This may be manual or automatic and may be used to direct dead arthropods down into the collection chamber under the influence of the air flow system, or to close the entrance when the collection chamber is removed and emptied.

[00119] Figure 8B is a schematic diagram of an internal frame of the housing shown in Figure 8A. The internal frame comprises a base portion 88a and a rear portion 88b with the chamber 72 supported at the intersection of the base and rear portions. The frame may be constructed of T-slot railing, including extruded railings, to provide flexibility in mounting locations, although other structural members such as tubes or rods may be used. A first camera 76 is mounted to a front portion of the base portion and a second camera 78 is mounted to an upper portion of the rear portion. The base portion 88a comprises two pairs of legs which support side rails to elevate the first camera 76 above the bottom surface of the housing, on which are mounted the laser 80 and targeting apparatus (e.g., galvanometer 82). The computing apparatus 60 may be mounted to the rear of the rear frame portion 88b.

[00120] In this embodiment two calibration plates 72a and 72b are slidably mounted respectively to the rails of the rear and base portions. The slidable mountings allow each calibration plate to move from a retracted storage location to a calibration location behind a respective wall of the chamber. First calibration plate 72a slides down to a location behind the rear wall of the chamber 72 such that it is visible in the field of view of the first camera 76, and the second calibration plate 72b slides along the base rails to a location under the base wall of the chamber 72 such that it is visible in the field of view of the second camera 78. The slidable mountings may allow manual movement of the calibration plates or may be automated using a belt drive or pneumatic actuators. In other embodiments the calibration plates could be folded out to be located in front of the front and top panels of the targeting chamber. The calibration plates comprise a known calibration pattern 72b' of rectangular grid lines of known (or predetermined) dimensions, such as 10cm squares. Images captured with the calibration plates in place can be used, with knowledge of the location of the plates and the calibration pattern, to calibrate the three dimensional geometry of the target chamber 72 and assist in determining the three dimensional positions of arthropods detected in images.

[00121] Figures 9A, 9B, and 9C are schematic diagrams showing three embodiments of a modular integrated sex sorting apparatus. In Figure 9A, the housing is made of sheet metal with a hinged access portion incorporating a laser safe viewing window in the front surface made of a translucent laser safe material. A light is located on a top surface to indicate when the system is in use and/or when the laser is being fired. An internal frame, such as illustrated in Figure 8B, may be used to support the cameras, chamber, targeting system and computers. In this embodiment the hinged portion provides access to base portion 88a of the frame which supports and houses the cameras, targeting system and chamber, and side access panels provide access to the rear of the housing which houses the computing apparatus 60 mounted to the rear portion 88b of the frame. A support is provided which supports the external components such as the tubing, fan 92, collection chamber 96 and final collection module 98. Figure 9B is a schematic diagram of a modular integrated sex sorting apparatus according to another embodiment. In this embodiment a double hinged access door is provided in the front panel to allow access to the cameras, chamber and targeting system, and the computing apparatus 60 is mounted in the base portion with access provided via drawers. Figure 9C is a schematic diagram of a modular integrated sex sorting apparatus according to another embodiment. This system is a fully enclosed system which hides the tubing of the air flow system. A rolling access panel is provided in the front and side portion of the housing 88. The computing apparatus 60 is side mounted adjacent to the door. Handles may be provided for ease of movement.

[00122] The connections of the air flow tubing to the entry aperture 73 and exit aperture 74 may be removable connections, such as by using a clamping arrangement (e.g. a screw adjustable band), a snap fit arrangement (e.g. one part has an annular ridge and the matching other part snap fits over the ridge), or a keyed arrangement (e.g. a key on one component and a matching retaining slot on the other part). A gasket may be used to provide an air seal between the parts. In other embodiments the connection may be a permanent connection which may be formed by gluing or welding the components together. In one embodiment the tubing has a 90mm diameter, although other suitable diameters (e.g., 50, 70, 120 or 150mm) which allow arthropods to move through the system may be used. Vibration damping connections may also be provided in the tubing on one or both sides of a fan module 92. The cameras may be mounted on isolated mounts, for example using rubber or soft spacer elements. A separate air compressor may be used to provide the directed air for the air flow system and, through isolation, reduce vibration in the system.

[00123] The chamber may be a rectangular chamber formed of flat panels connected at the edges, for example by welding or gluing, or it may be formed as an extruded box. In one embodiment the targeting chamber panels are 103mm by 197mm, although it will be understood that other sizes could be used (larger or smaller). Other geometries could also be used, such as an extruded tube or a chamber with flat front and top panels connected by a curved rear panel. This is illustrated in Figure 10, which shows a flat top panel 72c and front panel 72d connected by a curved rear panel 72e. In this embodiment a curved LED panel 72f located behind the curved rear panel 72e provides light into the chamber 72.

[00124] As discussed above, arthropods are collected in a container which is connected to the behavioural sorting module 40. The arthropods may be collected by an arthropod collection apparatus. The arthropod collection apparatus may be directly connected to the behavioural sorting module 40 to provide arthropods for sorting, optionally with a controllable barrier, such as a door or gate apparatus (manual or automatic), to control when arthropods are allowed to enter the behavioural sorting module 40. Alternatively, the arthropod collection apparatus may collect the arthropods into a container which is then temporarily connected to the behavioural sorting module 40, or the arthropod collection apparatus may collect the arthropods, and these are then transferred to a transport (or transfer) container which is transported to, and then connected to, the behavioural sorting module 40. This allows collection, whether by capture or rearing, to be performed in a location different from the location of the sex sorting apparatus. In some embodiments the arthropod collection apparatus may be an integrated apparatus comprising a rearing module 10, the sieving module 20, and a collection (emergence) container 30. In other embodiments the arthropod collection apparatus may be an arthropod (or insect) trap which collects the arthropods into a collection container. Other arthropod collection apparatus may be used, including a combination of traps and rearing systems, or multiple rearing systems. In some embodiments the collection (emergence) container 30 may be disconnected from the sieving module and used to transport the arthropods to the behavioural sorting module 40 (where it is connected). In some embodiments the arthropods are reared in batches, and the collection (emergence) container 30 is connected, or a door or gating apparatus is opened, when the arthropods are expected to emerge, or as discussed above, for some predefined time period around emergence (e.g., days 9-12). The door or gate apparatus may be manually or automatically opened and closed, for example based on the predefined time period around emergence.

[00125] The above embodiments have been focussed on sorting of mosquitoes, but it will be understood that the system may be used for sorting other arthropods including flies (including fruit flies), moths and beetles. It is noted that the system can also perform real-time tracking of crawling or moving arthropods. Both the sieving and 2D identification (classification) and 3D tracking systems take into account morphological differences between arthropods of different sexes. In some embodiments the 3D tracking system is further configured to track behavioural differences of arthropods within the target chamber 72. For example, the dimensions and arrangement of slots in the sieves, including the use of grooved slots, may preferentially allow males through the sieve. Similarly, the image classifier may be trained to recognise the morphological differences between the different sexes. Timing of sieving may also be selected to exploit sex related timing differences in when pupae emerge. The behavioural sorting system also takes account of behavioural differences between the sexes by providing a swarm marker and an audio lure which is synchronised with an air flow system. The air flow system creates directional flow through the system and can be used in combination with other chemical lures. The use of bladeless fans reduces the risk of harm to mosquitoes passing through the system (thus increasing the fitness of individuals collected in the final chamber 98).

[00126] The system may be implemented using one or more computing apparatus as described herein. The computing apparatus may comprise one or more processors, including multi-core CPUs and Graphical Processing Units (GPUs), operatively connected to one or more memories which store instructions to configure the processor to perform embodiments of the method. In this context, the computing system may include, for example, one or more processors (CPUs, GPUs), memories, storage, and input/output devices (e.g., monitor, keyboard, disk drive, network interface, internet connection, etc.). However, the computing apparatus may include circuitry or other specialised hardware for carrying out some or all aspects of the processes. The computing apparatus may be an all-in-one computer, a desktop computer, a laptop, a tablet, a mobile computing apparatus, a server, a microcontroller, or a microprocessor board, and any associated peripheral devices. The computing apparatus may be a distributed system including server based systems and cloud-based computing systems. The computing apparatus may be a unitary computing or programmable device, or a distributed system or device comprising several components operatively (or functionally) connected via wired or wireless connections. In some operational settings, the computing system may be configured as a system that includes one or more devices, each of which is configured to carry out some aspects of the processes either in software, hardware, or some combination thereof. For example, a microcontroller or microprocessor board (e.g., an Arduino board) may be used to control the audio lure, fans, and blower, whilst a separate image processing desktop incorporating GPUs may be used to perform the image processing, classification, tracking and targeting. A user interface may be provided on another computing apparatus, such as a laptop, which interfaces with the microcontroller and image processing desktop to allow a user to interact with, monitor and configure the system. The user interface may be provided as a web portal or interface allowing a user to remotely interact with, monitor and configure the system. In some embodiments data processing may be performed remotely on a server based system, including cloud based server systems, and the user interface is configured to communicate with such servers to exchange data and results.

[00127] An embodiment of a computing apparatus 60 is illustrated in Figure 1A and comprises a central processing unit (CPU) 61, a memory 62, and may include a GPU 63, an output device 64 such as a display apparatus, and/or an input device 65 such as a keyboard, mouse, etc. The display apparatus may be a touch screen which also acts as an input device. The CPU 61 comprises an Input/Output Interface, an Arithmetic and Logic Unit (ALU) and a Control Unit and Program Counter element which is in communication with input and output devices (e.g., input device 65 and display apparatus 64) through the Input/Output Interface. The Input/Output Interface may comprise a network interface and/or communications module for communicating with an equivalent communications module in another device using a predefined communications protocol (e.g., Bluetooth, Zigbee, IEEE 802.15, IEEE 802.11, TCP/IP, UDP, etc.).

[00128] A graphical processing unit (GPU) 63 may also be included. The display apparatus may comprise a flat screen display (e.g., LCD, LED, plasma, touch screen, etc.), a projector, a CRT, etc. The computing device may comprise a single CPU (core) or multiple CPUs (multiple cores), or multiple processors. The computing device may use a parallel processor, a vector processor, or be a distributed computing device. The memory is operatively coupled to the processor(s) and may comprise RAM and ROM components, and may be provided within or external to the device. The memory may be used to store the operating system and additional software modules or instructions. The processor(s) may be configured to load and execute the software modules or instructions stored in the memory. In one embodiment the computing system is an embedded AI system similar to an NVIDIA Jetson.

[00129] A computer program may be written, for example, in a general-purpose programming language (e.g., Python, Java, C++, C, C#, etc.) or some specialised application-specific language, and may utilise or call software libraries or packages, for example to implement data interfaces (e.g., JSON) or utilise machine learning (e.g., TensorFlow, CUDA); an illustrative sketch of such a data interface follows paragraph [00130].

[00130] Those of skill in the art would understand that information and signals may be represented using any of a variety of technologies and techniques. For example, data, instructions, commands, information, signals, bits, symbols, and chips that may be referenced throughout the above description may be represented by voltages, currents, electromagnetic waves, magnetic fields or particles, optical fields or particles, or any combination thereof.
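[00130A] As a non-limiting illustration of the data interfaces referred to in paragraph [00129] above, the following sketch serialises a per-detection record, such as might be exchanged between the image processing apparatus and a remote user interface, as JSON. The field names and values shown are hypothetical and chosen for illustration only.

    # Minimal sketch: a JSON data interface for reporting detections.
    # All field names are hypothetical, for illustration only.
    import json
    import time

    detection = {
        "timestamp": time.time(),       # time of detection (seconds since epoch)
        "track_id": 17,                 # identifier assigned by the 3D tracker
        "predicted_sex": "male",        # output of the image classifier
        "confidence": 0.97,             # classifier confidence score
        "position_mm": [120.5, 43.0, 88.2],  # estimated 3D position in chamber
        "action": "pass",               # "pass" (selected sex) or "target"
    }

    message = json.dumps(detection)     # serialise for transmission or storage
    print(message)

    received = json.loads(message)      # deserialise on the receiving side
    assert received["predicted_sex"] == "male"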

[00131] Those of skill in the art would further appreciate that the various illustrative logical blocks, modules, circuits, and algorithm steps described in connection with the embodiments disclosed herein may be implemented as electronic hardware, computer software or instructions, middleware, platforms, or combinations thereof. To clearly illustrate this interchangeability of hardware and software, various illustrative components, blocks, modules, circuits, and steps have been described above generally in terms of their functionality. Whether such functionality is implemented as hardware or software depends upon the particular application and design constraints imposed on the overall system. Skilled artisans may implement the described functionality in varying ways for each particular application, but such implementation decisions should not be interpreted as causing a departure from the scope of the present disclosure.

[00132] The steps of a method or algorithm described in connection with the embodiments disclosed herein may be embodied directly in hardware, in a software module executed by a processor, or in a combination of the two, including cloud-based systems. For a hardware implementation, processing may be implemented within one or more application specific integrated circuits (ASICs), digital signal processors (DSPs), digital signal processing devices (DSPDs), programmable logic devices (PLDs), field programmable gate arrays (FPGAs), processors, controllers, micro-controllers, microprocessors, or other electronic units designed to perform the functions described herein, or a combination thereof. Various middleware and computing platforms may be used.

[00133] In some embodiments the processor module comprises one or more Central Processing Units (CPUs) or Graphical Processing Units (GPUs) configured to perform some of the steps of the methods. Similarly, a computing apparatus may comprise one or more CPUs and/or GPUs. A CPU may comprise an Input/Output Interface, an Arithmetic and Logic Unit (ALU) and a Control Unit and Program Counter element which is in communication with input and output devices through the Input/Output Interface. The Input/Output Interface may comprise a network interface and/or communications module for communicating with an equivalent communications module in another device using a predefined communications protocol (e.g., Bluetooth, Zigbee, IEEE 802.15, IEEE 802.11, TCP/IP, UDP, etc.). The computing apparatus may comprise a single CPU (core) or multiple CPUs (multiple cores), or multiple processors. The computing apparatus may be a cloud-based computing apparatus using GPU clusters, a parallel processor, a vector processor, or be a distributed computing device. Memory is operatively coupled to the processor(s) and may comprise RAM and ROM components, and may be provided within or external to the device or processor module. The memory may be used to store an operating system and additional software modules or instructions. The processor(s) may be configured to load and execute the software modules or instructions stored in the memory.

[00134] Software modules, also known as computer programs, computer codes, or instructions, may contain a number of source code or object code segments or instructions, and may reside in any computer readable medium such as a RAM memory, flash memory, ROM memory, EPROM memory, registers, hard disk, a removable disk, a CD-ROM, a DVD-ROM, a Blu-ray disc, or any other form of computer readable medium. In some aspects the computer-readable media may comprise non-transitory computer-readable media (e.g., tangible media). In addition, for other aspects computer-readable media may comprise transitory computer-readable media (e.g., a signal). Combinations of the above should also be included within the scope of computer-readable media. In another aspect, the computer readable medium may be integral to the processor. The processor and the computer readable medium may reside in an ASIC or related device. The software codes may be stored in a memory unit and the processor may be configured to execute them. The memory unit may be implemented within the processor or external to the processor, in which case it can be communicatively coupled to the processor via various means as is known in the art.

[00135] Further, it should be appreciated that modules and/or other appropriate means for performing the methods and techniques described herein can be downloaded and/or otherwise obtained by a computing device. For example, such a device can be coupled to a server to facilitate the transfer of means for performing the methods described herein. Alternatively, various methods described herein can be provided via storage means (e.g., RAM, ROM, a physical storage medium such as a compact disc (CD) or floppy disk, etc.), such that a computing device can obtain the various methods upon coupling or providing the storage means to the device. Moreover, any other suitable technique for providing the methods and techniques described herein to a device can be utilised.

[00136] The methods disclosed herein comprise one or more steps or actions for achieving the described method. The method steps and/or actions may be interchanged with one another without departing from the scope of the claims. In other words, unless a specific order of steps or actions is specified, the order and/or use of specific steps and/or actions may be modified without departing from the scope of the claims.

[00137] As used herein, the term “determining” encompasses a wide variety of actions. For example, “determining” may include calculating, computing, processing, deriving, investigating, looking up (e.g., looking up in a table, a database or another data structure), ascertaining and the like. Also, “determining” may include receiving (e.g., receiving information), accessing (e.g., accessing data in a memory) and the like.

[00138] The reference to any prior art in this specification is not, and should not be taken as, an acknowledgement or any form of suggestion that such prior art forms part of the common general knowledge.

[00139] It will be understood that the terms “comprise” and “include” and any of their derivatives (e.g. comprises, comprising, includes, including) as used in this specification, and the claims that follow, are to be taken to be inclusive of the features to which the terms refer, and are not meant to exclude the presence of any additional features unless otherwise stated or implied.

[00140] In some cases, a single embodiment may, for succinctness and/or to assist in understanding the scope of the disclosure, combine multiple features. It is to be understood that in such a case, these multiple features may be provided separately (in separate embodiments), or in any other suitable combination. Alternatively, where separate features are described in separate embodiments, these separate features may be combined into a single embodiment unless otherwise stated or implied. This also applies to the claims, which can be recombined in any combination. That is, a claim may be amended to include a feature defined in any other claim. Further, a phrase referring to “at least one of” a list of items refers to any combination of those items, including single members. As an example, “at least one of: a, b, or c” is intended to cover: a, b, c, a-b, a-c, b-c, and a-b-c.

[00141] It will be appreciated by those skilled in the art that the disclosure is not restricted in its use to the particular application or applications described. Neither is the present disclosure restricted in its preferred embodiment with regard to the particular elements and/or features described or depicted herein. It will be appreciated that the disclosure is not limited to the embodiment or embodiments disclosed, but is capable of numerous rearrangements, modifications and substitutions without departing from the scope as set forth and defined by the following claims.