

Title:
REAL-TIME NEURAL MONITOR AND ANALYZER
Document Type and Number:
WIPO Patent Application WO/2016/112372
Kind Code:
A1
Abstract:
A real-time neural monitor and analyzer of neural activity, and a method of using the real-time neural monitor as part of a treatment regime to reduce neurological injury in a subject in need thereof or a method of using the real-time neural monitor to identify subjects in need of a treatment to prevent and/or reduce neuronal injury and/or onset of a seizure.

Inventors:
SHARPE JAMES MARK (US)
POULSEN DAVID J (US)
PHILP JAMES ALEXANDER (US)
Application Number:
PCT/US2016/012773
Publication Date:
July 14, 2016
Filing Date:
January 09, 2016
Assignee:
UNIV MONTANA (US)
International Classes:
A61N1/02
Foreign References:
US20100280334A12010-11-04
US20120226187A12012-09-06
US20130030486A12013-01-31
US20110191047A12011-08-04
US20090327172A12009-12-31
Attorney, Agent or Firm:
FULLER, Rodney J. (PLC, 1255 West Rio Salado Parkway, Suite 21, Tempe, Arizona, US)
Claims:
CLAIMS

We claim:

1. A computer-implemented method of identifying anomalous neural activity, comprising:

detecting an event from neural activity of a brain;

calculating a first feature vector by applying a first algorithm to the event;

calculating a second feature vector by applying a second algorithm to the event;

calculating, for the event, a unified feature vector based at least in part on the first feature vector and the second feature vector; and

identifying anomalous neural activity based at least in part on the unified feature vector.

2. A computer-implemented method of identifying anomalous neural activity, comprising:

detecting an event from neural activity of a brain;

calculating a first feature vector by applying a first algorithm to the event;

calculating a second feature vector by applying a second algorithm to the event;

calculating, for the event, a unified feature vector based at least in part on the first feature vector and the second feature vector;

classifying the unified feature vector into one or more categories using one or more of the following types of algorithms: cluster analysis, k-means clustering, support vector machines, artificial neural networks, or Bayesian networks;

characterizing the unified feature vector based at least in part on the classifying of the unified feature vector; and

identifying anomalous neural activity based at least in part on the classifying of the unified feature vector and the characterizing of the unified feature vector.

3. The method of claim 1 or 2, further comprising:

measuring the neural activity of the brain before detecting the event, wherein the anomalous neural activity is identified within 50 microseconds of measuring the neural activity.

4. The method of claim 1 or 2, further comprising:

measuring the neural activity of the brain at a sampling rate before detecting the event, wherein a time duration between measuring the neural activity and identifying the anomalous neural activity is at least as fast as the sampling rate.

5. The method of any one of the preceding claims, wherein the first algorithm comprises a supervised machine learning algorithm and the second algorithm comprises an unsupervised machine learning algorithm.

6. The method of any one of the preceding claims, wherein the first algorithm comprises a first cluster analysis algorithm.

7. The method of any one of the preceding claims, wherein the second algorithm is chosen from the group consisting of a machine learning algorithm, a second cluster analysis algorithm, and a cepstrum algorithm.

8. The method of any one of the preceding claims, wherein at least the first or second algorithm operates on a combination of discrete wavelet transform coefficients of the event and calculated metrics of the event, wherein the calculated metrics of the event comprise RMS value, peak-peak range, entropy, and relative energy for each wavelet band.

9. The method of any one of the preceding claims, wherein the unified feature vector is calculated by normalizing and concatenating the first feature vector and the second feature vector and by weighting the first feature vector and the second feature vector unequally.

10. The method of any one of the preceding claims, further comprising:

calculating event metadata for the event, wherein event metadata comprises one or more of: a duration of the event, an RMS value, a maximum positive value, a minimum negative value, a ratio between the maximum positive value and the minimum negative value, a ratio between a total power of frequency bands, or a time since a last recorded event on a same channel.

11. The method of any one of claims 1 or 3-10, further comprising:

classifying the unified feature vector into one or more categories using one or more of the following types of algorithms: cluster analysis, k-means clustering, support vector machines, artificial neural networks, or Bayesian networks.

12. The method of any one of claims 1 or 3-11, further comprising:

characterizing the unified feature vector based at least in part on the classifying of the unified feature vector.

13. The method of any one of claims 1 or 3-12, wherein identifying anomalous neural activity is based at least in part on the classifying of the unified feature vector and the characterizing of the unified feature vector.

14. The method of any one of the preceding claims, further comprising:

enriching the unified feature vector to mathematically or visually more clearly identify distinguishing features of the event.

15. The method of any one of the preceding claims, wherein the measuring the neural activity is measured by one or more of: EEG electrodes, subdural electrodes, or deep brain electrodes.

16. The method of any one of the preceding claims, wherein the identifying anomalous neural activity is performed on an ambulatory device.

17. A computer-readable medium including contents that are configured to cause a computing system to identify anomalous neural activity by performing a method comprising:

detecting an event from neural activity of a brain;

calculating a first feature vector by applying a first algorithm to the event;

calculating a second feature vector by applying a second algorithm to the event;

calculating, for the event, a unified feature vector based at least in part on the first feature vector and the second feature vector;

classifying the unified feature vector into one or more categories using one or more of the following types of algorithms: cluster analysis, k-means clustering, support vector machines, artificial neural networks, or Bayesian networks; and

identifying anomalous neural activity based at least in part on the classifying of the unified feature vector.

18. The computer-readable medium of claim 17, wherein at least the first or second algorithm operates on a combination of discrete wavelet transform coefficients of the event and calculated metrics of the event, wherein the calculated metrics of the event comprise RMS value, peak-peak range, entropy, and relative energy for each wavelet band.

19. The computer-readable medium of claims 17 or 18, wherein the first algorithm comprises a supervised machine learning algorithm and the second algorithm comprises an unsupervised machine learning algorithm.

20. The computer-readable medium of any one of claims 17-19, wherein the first algorithm comprises a first cluster analysis algorithm.

21. The computer-readable medium of any one of claims 17-20, wherein the second algorithm is chosen from the group consisting of a machine learning algorithm, a second cluster analysis algorithm, and a cepstrum algorithm.

22. The computer-readable medium of any one of claims 17-21, further comprising:

characterizing the unified feature vector based at least in part on the classifying of the unified feature vector.

23. The computer-readable medium of any one of claims 17-22, wherein identifying anomalous neural activity is based at least in part on the classifying of the unified feature vector and the characterizing of the unified feature vector.

Description:
REAL-TIME NEURAL MONITOR AND ANALYZER

RELATED APPLICATION

[0001] This application claims the benefit of U.S. Provisional Application No. 62/101,896, filed January 9, 2015, titled "Real-time Neural Monitor and Analyzer," the contents of which are herein expressly incorporated by reference for all purposes.

TECHNICAL FIELD

[0002] The invention relates to systems and methods of analyzing neural activity to monitor or treat patients having suffered a neurological injury.

BACKGROUND

[0003] Electroencephalography ("EEG") is one of various methods of measuring neural activities of humans and sometimes animals (e.g., in lab settings). EEG measures electrical potential differences of nearby neurons and is used to help medical professionals understand the state of health of a patient. Unfortunately, EEG and other methods of measuring neural activities tend to generate a tremendous amount of data, which only increases as more electrodes and channels are added and as sampling rates increase. Systems exist to analyze EEG, such as seizure detection or prediction systems, but these systems are frequently slow in handling the large amount of data generated. Moreover, the analyzed data is often very complex and difficult to understand without specialized training.

[0004] Applicant believes that the material incorporated above is "non-essential" in accordance with 37 CFR 1.57, because it is referred to for purposes of indicating the background of the inventions or illustrating the state of the art. However, if the Examiner believes that any of the above-incorporated material constitutes "essential material" within the meaning of 37 CFR 1.57(c)(1)-(3), applicant will amend the specification to expressly recite the essential material that is incorporated by reference as allowed by the applicable rules.

SUMMARY

[0005] The present disclosure provides among other things a computer-implemented method of identifying anomalous neural activity. The method may comprise detecting an event from neural activity of a brain; calculating a first feature vector by applying a first algorithm to the event; calculating a second feature vector by applying a second algorithm to the event; calculating, for the event, a unified feature vector based at least in part on the first feature vector and the second feature vector; and identifying anomalous neural activity based at least in part on the unified feature vector.

[0006] According to one aspect, a computer-implemented method of identifying anomalous neural activity can comprise detecting an event from neural activity of a brain; calculating a first feature vector by applying a first algorithm to the event; calculating a second feature vector by applying a second algorithm to the event; calculating, for the event, a unified feature vector based at least in part on the first feature vector and the second feature vector; classifying the unified feature vector into one or more categories using one or more of the following types of algorithms: cluster analysis, k-means clustering, support vector machines, artificial neural networks, or Bayesian networks; characterizing the unified feature vector based at least in part on the classifying of the unified feature vector; and identifying anomalous neural activity based at least in part on the classifying of the unified feature vector and the characterizing of the unified feature vector.

[0007] According to some aspects, a computer-implemented method of identifying anomalous neural activity can further comprise: measuring the neural activity of the brain before detecting the event, wherein the anomalous neural activity is identified within 50 microseconds of measuring the neural activity.

[0008] According to certain aspects, a computer-implemented method of identifying anomalous neural activity can further comprise: measuring the neural activity of the brain at a sampling rate before detecting the event, wherein a time duration between measuring the neural activity and identifying the anomalous neural activity is at least as fast as the sampling rate.

[0009] In some aspects, the first algorithm comprises a supervised machine learning algorithm and the second algorithm comprises an unsupervised machine learning algorithm. In certain aspects, the first algorithm comprises a first cluster analysis algorithm. In some aspects, the second algorithm is chosen from the group consisting of a machine learning algorithm, a second cluster analysis algorithm, and a cepstrum algorithm. In various aspects, at least the first or second algorithm operates on a combination of discrete wavelet transform coefficients of the event and calculated metrics of the event, wherein the calculated metrics of the event comprise RMS value, peak-peak range, entropy, and relative energy for each wavelet band.
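The per-band metrics named in this aspect can be pictured with a short, illustrative sketch. The Python below is not part of the application; the PyWavelets library, the db4 wavelet, the fixed decomposition level, and the function name wavelet_band_metrics are assumptions made purely for illustration.

    import numpy as np
    import pywt

    def wavelet_band_metrics(event_samples, wavelet="db4", level=4):
        """Illustrative per-band metrics: RMS, peak-peak range, entropy, relative energy."""
        coeffs = pywt.wavedec(np.asarray(event_samples, dtype=float), wavelet, level=level)
        band_energy = np.array([np.sum(c ** 2) for c in coeffs])
        total_energy = band_energy.sum() or 1.0          # guard against an all-zero event
        metrics = []
        for c, energy in zip(coeffs, band_energy):
            p = np.abs(c) / (np.sum(np.abs(c)) or 1.0)   # normalized magnitude distribution
            entropy = -np.sum(p * np.log2(p + 1e-12))    # Shannon entropy of the band
            metrics.append({
                "rms": float(np.sqrt(np.mean(c ** 2))),
                "peak_to_peak": float(c.max() - c.min()),
                "entropy": float(entropy),
                "relative_energy": float(energy / total_energy),
            })
        return metrics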

[0010] According to certain aspects, the unified feature vector is calculated by normalizing and concatenating the first feature vector and the second feature vector and by weighting the first feature vector and the second feature vector unequally.
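One possible reading of this combination step is sketched below: each constituent vector is scaled to unit length, weighted unequally, and concatenated. The weights 0.7 and 0.3 and the unit-norm scaling are hypothetical choices, not values disclosed in the application.

    import numpy as np

    def unify_feature_vectors(first_fv, second_fv, w_first=0.7, w_second=0.3):
        """Normalize, weight unequally, and concatenate two feature vectors."""
        def normalize(v):
            v = np.asarray(v, dtype=float)
            norm = np.linalg.norm(v)
            return v / norm if norm else v               # leave an all-zero vector unchanged
        return np.concatenate([w_first * normalize(first_fv),
                               w_second * normalize(second_fv)])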

[0011] According to some aspects, a computer-implemented method of identifying anomalous neural activity can further comprise: calculating event metadata for the event, wherein event metadata comprises one or more of: a duration of the event, an RMS value, a maximum positive value, a minimum negative value, a ratio between the maximum positive value and the minimum negative value, a ratio between a total power of frequency bands, or a time since a last recorded event on a same channel.

[0012] According to various aspects, a computer-implemented method of identifying anomalous neural activity can further comprise: classifying the unified feature vector into one or more categories using one or more of the following types of algorithms: cluster analysis, k-means clustering, support vector machines, artificial neural networks, or Bayesian networks.

[0013] According to certain aspects, a computer-implemented method of identifying anomalous neural activity can further comprise: characterizing the unified feature vector based at least in part on the classifying of the unified feature vector.

[0014] In some aspects, identifying anomalous neural activity is based at least in part on the classifying of the unified feature vector and the characterizing of the unified feature vector. In certain aspects, the neural activity is measured by one or more of: EEG electrodes, subdural electrodes, or deep brain electrodes. In some aspects, the identifying anomalous neural activity is performed on an ambulatory device.

[0015] According to some aspects, a computer-implemented method of identifying anomalous neural activity can further comprise: enriching the unified feature vector to mathematically or visually more clearly identify distinguishing features of the event.

[0016] According to one aspect, a computer-readable medium including contents that are configured to cause a computing system to identify anomalous neural activity by performing a method can comprise: detecting an event from neural activity of a brain; calculating a first feature vector by applying a first algorithm to the event; calculating a second feature vector by applying a second algorithm to the event; calculating, for the event, a unified feature vector based at least in part on the first feature vector and the second feature vector; classifying the unified feature vector into one or more categories using one or more of the following types of algorithms: cluster analysis, k-means clustering, support vector machines, artificial neural networks, or Bayesian networks; and identifying anomalous neural activity based at least in part on the classifying of the unified feature vector.

[0017] In some aspects, at least the first or second algorithm operates on a combination of discrete wavelet transform coefficients of the event and calculated metrics of the event, wherein the calculated metrics of the event comprise RMS value, peak-peak range, entropy, and relative energy for each wavelet band. In some aspects, the first algorithm comprises a supervised machine learning algorithm and the second algorithm comprises an unsupervised machine learning algorithm. In various aspects, the first algorithm comprises a first cluster analysis algorithm. In other aspects, the second algorithm is chosen from the group consisting of a machine learning algorithm, a second cluster analysis algorithm, and a cepstrum algorithm. In some aspects, identifying anomalous neural activity is based at least in part on the classifying of the unified feature vector and the characterizing of the unified feature vector.

[0018] According to some aspects, a computer-readable medium including contents that are configured to cause a computing system to identify anomalous neural activity by performing a method can further comprise: characterizing the unified feature vector based at least in part on the classifying of the unified feature vector.

[0019] Aspects and applications of the disclosure are described below with reference to the Drawings and the Detailed Description. Unless specifically noted, it is intended that the words and phrases in the specification and the claims be given their plain, ordinary, and accustomed meaning to those of ordinary skill in the applicable arts. The inventors are fully aware that they can be their own lexicographers if desired. The inventors expressly elect, as their own lexicographers, to use only the plain and ordinary meaning of terms in the specification and claims unless they clearly state otherwise and then further, expressly set forth the "special" definition of that term and explain how it differs from the plain and ordinary meaning. Absent such clear statements of intent to apply a "special" definition, it is the inventors' intent and desire that the simple, plain and ordinary meaning to the terms be applied to the interpretation of the specification and claims.

[0020] The inventors are also aware of the normal precepts of English grammar. Thus, if a noun, term, or phrase is intended to be further characterized, specified, or narrowed in some way, then such noun, term, or phrase will expressly include additional adjectives, descriptive terms, or other modifiers in accordance with the normal precepts of English grammar. Absent the use of such adjectives, descriptive terms, or modifiers, it is the intent that such nouns, terms, or phrases be given their plain, and ordinary English meaning to those skilled in the applicable arts as set forth above.

BRIEF DESCRIPTION OF THE DRAWINGS

[0021] A more complete understanding of the present invention may be derived by referring to the detailed description when considered in connection with the following illustrative figures. In the figures, like reference numbers refer to like elements or acts throughout the figures.

[0022] FIGs. 1A-1B depict a neural monitoring and analyzing system.

[0023] FIG. 2 depicts a method of analyzing neural activity.

[0024] Elements and acts in the figures are illustrated for simplicity and have not necessarily been rendered according to any particular sequence or embodiment.

DETAILED DESCRIPTION

[0025] In the following description, and for the purposes of explanation, numerous specific details are set forth in order to provide a thorough understanding of the various aspects of the disclosed embodiments. It will be understood, however, by those skilled in the relevant arts, that the embodiments may be practiced without these specific details. In other instances, known structures and devices are shown or discussed more generally in order to avoid obscuring the discussion of the various embodiments. In many cases, a description of the operation is sufficient to enable one to implement the various forms of the embodiments, particularly when the operation is to be implemented in software. It should be noted that there are many different and alternative configurations, devices and technologies to which the disclosed embodiments may be applied. The full scope of the embodiments is not limited to the examples that are described below.

[0026] FIGs. 1A-1B illustrate various embodiments of a neural monitoring and analyzing system 100 capable of real-time analysis of neural activity. FIG. 1A depicts a subject 102 having electrodes 110 connected near neurons of subject 102 to be monitored and measured by recorder 116. Analyzer 130 receives output from recorder 116 and analyzes the neural activity of subject 102 by identifying and analyzing neurological events of physiological importance (which includes both physiological and pathological neural activity). The analysis of subject 102 is monitored by user 152, who utilizes a computer system 150 configured to communicate with and instruct analyzer 130. User 152 may be a medical professional, such as a doctor, nurse, technician, medic, assistant, or the like. The neural activity of a subject 102 having suffered an injury is monitored with the goal of understanding or diagnosing effects of the injury, identifying risks of future injury, or providing treatment to subject 102 to improve health or limit further injury. The subject 102 can be treated with numerous therapies (e.g., drugs, surgery, neurostimulation, etc.) as illustrated by treatment feedback 160 and automatic treatment feedback 162. Automatic treatment feedback 162 indicates a treatment provided automatically to subject 102 based at least in part on the function of analyzer 130. For example, treatment feedback 160 may involve a medical professional administering a drug to subject 102, or automatic treatment feedback 162 may involve automated instructions to provide subject 102 with neurostimulation or a drug automatically. In addition, analyzer feedback 164 provides user 152 with a way to adjust the analysis of the neural activity of subject 102, thereby customizing analyzer 130 to the needs of subject 102.

[0027] In one embodiment, subject 102 is a human patient who has suffered a neurological insult, such as a traumatic brain injury. Many traumatic brain injuries have a cascading effect of further neurological injuries above and beyond the initial brain injury, such as swelling, infection, "rewiring" of neural pathways, motor dysfunction, neurological disorders, and so forth. In particular, traumatic brain injuries may greatly increase the risk of developing epilepsy. Medical professionals are aware of numerous "signature" electroencephalographic ("EEG") waveforms associated with epilepsy, brain injuries, and other neural pathologies. However, the human brain is very complex and many biomarkers are still not understood or even identified. Neural analyzing system 100 allows user 152 not only to analyze the neural activity of subject 102 and identify previously identified neurological events, but also to investigate potentially relevant neurological events that are still not fully understood. Thus, neural analyzing system 100 can help, for example, to identify anomalous EEG wave patterns that may be predictive indicators of whether subject 102 is at risk for developing post-traumatic epilepsy.

[0028] FIG. 1B illustrates various embodiments and aspects of a neural monitoring and analyzing system 100 capable of real-time analysis of neural activity. At least one pair of neurosensing electrodes 110 is placed on subject 102 to measure changes in neuronal activity over time by detecting changing electric potentials near the electrodes 110, which are measured as raw data 114. Subject 102 is typically a human, but may also be an animal used in studies (e.g., rats, mice, or monkeys). Electrodes 110 are capable of sensing the electrical potential of nearby tissue, which changes as action potentials of neurons cause small changes in electrical activity. Cutaway 120 of FIG. 1B depicts several alternative electrodes 110 available for subject 102. Electrodes 110 are preferably designed and used to record raw data 114 from the brain 122 of subject 102. Electrodes 110 may comprise several EEG electrodes 124 positioned on the scalp of subject 102 to measure electrical activity of the brain 122 of subject 102. Alternatively, electrodes 110 may be non-EEG electrodes, such as: subdural electrodes (including single, strip, and array subdural electrodes), deep brain electrodes, sphenoidal electrodes, foramen ovale electrodes, nasopharyngeal electrodes, other types of electrocorticography ("ECoG") or intracranial EEG ("iEEG") electrodes, or the like. For example, cutaway 120 depicts a deep brain electrode 126 that penetrates the brain 122 and a subdural electrode 128 positioned on the brain 122 and below the dura mater of subject 102. In alternative embodiments, electrodes 110 may be used to measure neuronal activity outside the brain 122 (e.g., the spinal cord, cranial nerves, and other peripheral nerves).

[0029] Electrodes 110 may be of varying types used at varying locations on subject 102 to measure neuronal electrical activity as raw data 114. Because individual neurons produce very small action potentials, electrodes 110 generally detect the electrical activity of numerous nearby neurons. The number of neurons measured by electrodes 110 depends on, for example, the placement, size, and shape of electrodes 110. When electrodes 110 are large, for example, they will tend to measure the electrical activity of more neurons than if small electrodes 110 are used. However, muscle cells and endocrine cells also produce action potentials, which would be measured by nearby electrodes 110. Electrodes 110 are preferably of a size and shape, and located on or in subject 102, to maximize the neuronal activity recorded in raw data 114 and reduce non-targeted electrical activity such as muscle artifacts. Generally, smaller electrodes 110 located closer to targeted neurons and away from muscles produce more useful raw data 114.

[0030] Raw data 114 is the measured electrical potential difference between at least two different electrodes from the electrodes 110. A single channel of raw data 114 results from measuring the electrical potential difference over time between two electrodes. For example, raw data 114 may be the measured voltage of each channel from EEG electrodes 124 measured in microvolts across time. Depending on the type, number, placement, configuration, and montage of electrodes 110, raw data 114 may comprise numerous channels of measured neuronal electrical activity data. For example, raw data 114 from EEG electrodes 124 will often utilize over 10 channels (e.g., 16, 20, 24, 32, or 64 channels), but deep brain electrodes 126 generally have fewer channels (e.g., 1 to 16 channels).
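One convenient way to picture this channel structure in software, offered only as an illustrative assumption and not as a data format specified by the application, is a channels-by-samples array tagged with its sampling rate:

    from dataclasses import dataclass
    import numpy as np

    @dataclass
    class RawData:
        samples_uv: np.ndarray        # shape (n_channels, n_samples), in microvolts
        sampling_rate_hz: float
        channel_names: list

        def channel(self, name):
            """Return the potential-difference trace for one named channel."""
            return self.samples_uv[self.channel_names.index(name)]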

[0031] A recorder 116 receives raw data 114 as an input from electrodes 110 via connection 112. Recorder 116 may receive just one channel of raw data 114 via connection 112, but preferably receives numerous channels of raw data 114 corresponding to numerous electrodes 110. Recorder 116 measures and records the raw data 114, then collates the multiple channels of raw data 114 into a format usable by the analyzer 130. Recorder 116 may filter or otherwise process raw data 114, or it may skip such processes (e.g., where the analyzer 130 is programmed to filter raw data 114). In various other embodiments, recorder 116 is located within analyzer 130, at or near the electrodes 110, or attached to the body or clothing of subject 102. In some embodiments, recorder 116 is sufficiently small and has sufficient battery power to be ambulatory by being attached to the body or clothing of subject 102 or being implanted in the body of subject 102. In certain embodiments, recorder 116 receives additional data inputs regarding subject 102, such as: streaming video, sound, respiration, cardiac activity, pulse oximetry, and other measured body functions or behavior.

[0032] An analyzer 130 receives raw data 114 from recorder 116 via connection 118. Analyzer 130 comprises software, hardware, and/or firmware capable of analyzing the neural activity of subject 102 measured as raw data 114 with the goal of identifying physiologically relevant events and biomarkers. Raw data 114 that is analyzed at least in part in analyzer 130 is identified as analyzed data 132. Analyzer 130 may perform additional processing on analyzed data 132. Connections 112 and 118 may each be wired or wireless connections. Analyzer 130 may analyze live streaming raw data 114 or historical raw data 114 previously stored in memory, such as in database 140. In one embodiment, analyzer 130 analyzes all live streaming raw data 114 in real-time. In another embodiment, analyzer 130 analyzes only a portion of live streaming raw data 114 in real-time.

[0033] Network 170 connects analyzer 130 to database 140 and optional computer system 150. Database 140 is configured to store data such as raw data 114 and analyzed data 132. In addition, database 140 may interface and communicate with optional computer system 150. Computer system 150 may be used to customize neural activity monitoring algorithms, to view results from the analyzed data 132, to specify the manner of presenting the results from the algorithms, and so forth. Computer system 150 may be configured to be used by user 152 or by multiple users, or to operate automatically without user intervention.

[0034] In one embodiment, raw data 114 is live streaming data and at least some portion of raw data 114 is capable of being analyzed in real-time. Simplistically, "real-time" means that at least some of the raw data 114 is analyzed in analyzer 130 before being stored in database 140. That is, raw data 114 analyzed in real-time is analyzed sufficiently fast to produce analyzed data 132 without being forced to store the raw data 114 in database 140 first. Analyzer 130, therefore, has end-to-end latency sufficiently low to be considered a real-time analyzer. Analyzer 130 utilizes a memory buffer to temporarily store raw data 114 awaiting analysis, but the analysis is sufficiently fast to keep up with the large amount of live streaming raw data 114, such that the memory buffer is not blown and the raw data is not forced to be stored in database 140 to await analysis. Real-time analysis may, for example, create the analyzed data 132 in less than 50 microseconds after measuring raw data 114, or possibly even at a rate faster than the sampling rate of the measured raw data 114. In one example, real-time analysis may create the analyzed data 132 in less than 3 seconds after measuring raw data 114. In other embodiments, real-time analysis depends on the biomarkers, features, or trends within raw data 114 being targeted, and might be executed in a matter of minutes instead of seconds (e.g., 1-5 min., 5-15 min., or 30-60 min.). Accordingly, the definition of real-time analysis depends on the questions one is trying to answer by executing algorithms in analyzer 130, with the overall goal of providing a useful answer (via analyzed data 132) within a timeframe expected by a user. More complicated algorithms may take more time to run than simple algorithms. In addition, real-time analysis may also comprise generating initial answers with fast and simple algorithms while final answers are generated more slowly using more complicated algorithms.

[0035] FIG. 2 illustrates an embodiment of a method for analyzing neural activity. Analyzer 130 contains numerous functions, including ingesting 210 raw data 114 from a subject 202 (not shown). Raw data 114 may be ingested 210 at a specified sampling rate from electrodes attached to subject 202. Raw data 114 may also be ingested 210 from files stored on a computer or on database 140. Raw data 114 is optionally preprocessed 220. During feature extraction 230, event detector 232 identifies candidate events 234 from the raw data 114 and at least two feature algorithms 236 are applied to each candidate event 234. For each candidate event 234 detected in feature extraction 230, a unified feature vector 236v of n-dimensions is constructed using the feature algorithms 236. Event 235 comprises the candidate event 234 together with the unified feature vector 236v. The detected event 235, or just the unified feature vector 236v, may next be optionally enriched 240 and/or sorted 250. The unified feature vectors 236v are classified 270 by at least one classifying algorithm. The events 235 are also characterized 280 to quantify various aspects of the events 235. Database 140 is available to store data and information output from the various functional blocks. Computer system 150 interfaces with database 140 and is operable by user 152.
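As a schematic illustration of this buffering constraint only, the sketch below holds incoming raw-data blocks in a bounded in-memory queue and treats a full queue as the sign that analysis is not keeping up with the stream. The queue size and the analyze/store callables are hypothetical; the application does not prescribe any particular data structure.

    import queue

    raw_buffer = queue.Queue(maxsize=1024)       # temporary holding area for raw-data blocks

    def on_block_received(block):
        """Called as each block of streaming raw data arrives from the recorder."""
        try:
            raw_buffer.put_nowait(block)
        except queue.Full:
            raise RuntimeError("analysis is falling behind the incoming data rate")

    def analysis_loop(analyze, store):
        """Analyze blocks in arrival order, then persist the analyzed data."""
        while True:
            block = raw_buffer.get()
            store(analyze(block))                # analyzed before being stored in the database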

[0036] Analyzer 130 functions to identify candidate events 234, events 235, and feature vectors 236v. An event 235 is intended to represent a physiological state of relevance to a medical professional. For example, events 235 identify biomarkers indicative of the health of subject 202. Events 235 may correlate to past or present health states, or even be predictive of future health states. Some events 235 may be indicative of a healthy state while other events 235 may be indicative of unhealthy or pathological states. Events 235 may also be declared by analyzer 130 because they match a certain profile even though the significance of the event 235 is not yet understood by user 152. Also, the sensitivity of algorithms (e.g., event detector 232, feature algorithms 236, etc.) employed during feature extraction 230 may often result in events 235 being declared for suspected or potential physiological states. In some embodiments, brain neural activity is analyzed following a brain injury to detect events 235 from EEG data (e.g., spikes) that may be indicators of whether the injured patient is at risk of developing post-traumatic epilepsy. For example, certain spikes in an EEG waveform can assist user 152 in understanding the health of subject 202 because they may be indicative of developing epilepsy or a high propensity for seizures. In some embodiments, the events 235 correlate to a seizure experienced by an epileptic or non-epileptic patient. In certain embodiments, an event 235 is declared for each channel of incoming neural activity data, while other embodiments declare events 235 using the data from multiple or all of the channels of neural activity data. In one embodiment, events 235 are grouped by time to prevent duplication of events within a short time period (e.g., two events detected in less than one second are grouped and considered a single event).

[0037] In one embodiment, raw data 114 is optionally preprocessed and/or filtered in preprocessor 220 after being ingested 210. Preprocessing 220 may include filtering out noise, transients, or unwanted frequencies from raw data 114. Preprocessing 220 may also include filtering raw data 114 into one or more frequency bands, such as the common EEG frequency bands of alpha, beta, delta, gamma, and theta. In another embodiment, preprocessing 220 occurs before ingesting 210.
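By way of illustration, and assuming the SciPy signal-processing library, the sketch below splits a single channel into the conventional EEG bands named above using a zero-phase Butterworth band-pass. The band edges are common textbook values, not values taken from the application, and the sampling rate must exceed twice the highest band edge.

    from scipy.signal import butter, filtfilt

    EEG_BANDS_HZ = {"delta": (0.5, 4), "theta": (4, 8), "alpha": (8, 13),
                    "beta": (13, 30), "gamma": (30, 80)}

    def split_into_bands(channel, fs):
        """Return one zero-phase band-pass filtered copy of the channel per EEG band."""
        out = {}
        for name, (lo, hi) in EEG_BANDS_HZ.items():
            b, a = butter(4, [lo / (fs / 2), hi / (fs / 2)], btype="band")
            out[name] = filtfilt(b, a, channel)
        return out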

[0038] Feature extraction 230 comprises an event detector 232 and at least two feature algorithms 236 that are applied to raw data 114, which has been ingested 210 and may or may not have been preprocessed 220 as well. Event detector 232 uses one or more heuristics or other algorithms to identify candidate events 234. In some embodiments, event detector 232 uses the magnitude of an EEG signal and the time elapsed since the previous candidate event 234 to declare a new candidate event 234. The feature algorithms 236 (comprising at least a first feature detector algorithm 238 and second feature detector algorithm 239) are configured to calculate a unified feature vector 236v from each candidate event 234. The unified feature vector 236v is designed to be utilized by a machine learning or pattern recognition algorithm. Both first feature detector algorithm 238 and second feature detector algorithm 239 calculate a separate feature vector for the detected events. Thus, first feature detector algorithm 238 calculates a first feature vector 238v and second feature detector algorithm 239 calculates a second feature vector 239v. A feature vector is an n-dimensional vector with numerical features representing some event. As streaming raw data 114 is ingested 210, numerous events 235 and unified feature vectors 236v may be detected (on single or multiple channels). For each event 235, the first and second feature vectors 238v and 239v are calculated and then combined into the unified feature vector 236v. The unified feature vector 236v may be formed by equally or unequally weighting the first and second feature vectors 238v and 239v, or portions thereof.
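One illustrative reading of such a magnitude-plus-elapsed-time heuristic is sketched below. The threshold, the refractory period, and the function name detect_candidate_events are hypothetical placeholders, not parameters disclosed by the application.

    def detect_candidate_events(channel, fs, threshold_uv=75.0, refractory_s=1.0):
        """Declare a candidate event when |signal| exceeds a threshold outside a refractory window."""
        candidates = []
        min_gap = int(refractory_s * fs)             # minimum samples between candidates
        last_idx = -min_gap
        for idx, value in enumerate(channel):
            if abs(value) >= threshold_uv and idx - last_idx >= min_gap:
                candidates.append(idx)               # sample index of the candidate event
                last_idx = idx
        return candidates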

[0039] Feature vectors 238v, 239v, and 236v are each created by performing signal processing calculations on the raw data 114 and concatenating the results into a single structure, called a feature vector. These calculations may include, for example, calculating scalar metrics such as the event duration or the root mean square (RMS) value, calculating Discrete Wavelet Transform (DWT) coefficients, calculating a Fast Fourier Transform (FFT), calculating cepstrum coefficients, and so forth. Thus, n-dimensional feature vectors (238v, 239v, and 236v) can be calculated using at least the raw data 114.

[0040] In some embodiments, at least one of the first or second feature detector algorithms 238 and 239 is an FFT, a DWT, and/or a cepstrum calculation. Thus, first and second feature detector algorithms 238 and 239 calculate their corresponding first and second feature vectors 238v and 239v. First and second feature vectors 238v and 239v are then normalized, concatenated, and combined to create unified feature vector 236v. In some embodiments, feature algorithms 236 include more than two feature detector algorithms, but a unified feature vector 236v is still calculated by combining all individual feature vectors for each event 235.
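To make the kinds of calculations listed above concrete, the sketch below shows an FFT-based detector, a cepstrum-based detector, and a few scalar metrics, each returning a short numeric vector for one candidate event. The vector lengths and the coarse spectral binning are illustrative assumptions rather than choices made in the application.

    import numpy as np

    def fft_features(event_samples, n_bins=16):
        """Coarse spectral profile: mean FFT magnitude in equal-width frequency bins."""
        spectrum = np.abs(np.fft.rfft(event_samples))
        return np.array([chunk.mean() for chunk in np.array_split(spectrum, n_bins)])

    def cepstrum_features(event_samples, n_coeffs=16):
        """First few real-cepstrum coefficients (event must be longer than n_coeffs samples)."""
        log_spectrum = np.log(np.abs(np.fft.rfft(event_samples)) + 1e-12)
        return np.fft.irfft(log_spectrum)[:n_coeffs]

    def scalar_metrics(event_samples):
        """Event duration (in samples), RMS value, and peak-peak range."""
        x = np.asarray(event_samples, dtype=float)
        return np.array([len(x), np.sqrt(np.mean(x ** 2)), x.max() - x.min()])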

[0041] In addition to the raw data 114 and the several calculated feature vectors (e.g., 236v, 238v, and 239v), feature extractor 230 may optionally identify or calculate event metadata 235m. Event metadata 235m may utilize one or more of the disclosed algorithms or other mathematical techniques to generate useful metrics correlated to an event 235. For example, feature extractor 230 may calculate some scalar metrics as metadata 235m, which are not included in feature vector 235v. Scalar metrics used as metadata 235m may include, for example, event 235 duration, RMS value, maximum positive value, minimum negative value, ratio between the maximum positive value and the minimum negative value, the ratio between total power of frequency bands (e.g., EEG alpha, beta, delta, gamma, or theta), time since last recorded event on the same channel, or the like.
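A hedged sketch of collecting such metadata as a plain dictionary, kept alongside rather than inside the feature vector, is shown below. The argument names and the particular pair of bands compared in the power ratio are illustrative assumptions.

    import numpy as np

    def event_metadata(event_samples, fs, band_power, last_event_time_s, event_time_s):
        """Scalar metadata for one event; band_power maps band names to total power."""
        x = np.asarray(event_samples, dtype=float)
        return {
            "duration_s": len(x) / fs,
            "rms": float(np.sqrt(np.mean(x ** 2))),
            "max_positive": float(x.max()),
            "min_negative": float(x.min()),
            "pos_neg_ratio": float(x.max() / abs(x.min())) if x.min() else None,
            "band_power_ratio": band_power["alpha"] / band_power["beta"],   # example pair
            "time_since_last_event_s": event_time_s - last_event_time_s,
        }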

[0042] During classifying 270, one or more supervised machine learning, unsupervised machine learning, or pattern recognition algorithms classify the unified feature vector 236v. The classifying 270 algorithm is one or more of the following types of algorithms: cluster analysis, k-means clustering, support vector machines, artificial neural networks, or Bayesian networks. The classifying 270 algorithm may be linear or non-linear. The classifying 270 process primarily classifies multiple events 235 into distinct categories based on similar features of the events 235. Events 235 may be classified 270 based on the originating channel(s) of the event 235 (e.g., a location on or within brain 122, pairs of electrodes 110, or differential mode electrodes 110). In some embodiments, the classifying 270 algorithm is a cluster analysis algorithm, such as a k-means clustering algorithm.
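Assuming the scikit-learn library, one minimal way to realize the k-means variant mentioned above is sketched below; the number of categories, and the use of scikit-learn itself, are assumptions made only for illustration.

    import numpy as np
    from sklearn.cluster import KMeans

    def classify_unified_vectors(unified_vectors, n_categories=5):
        """Cluster unified feature vectors; returns one category label per event."""
        X = np.vstack(unified_vectors)               # one row per event
        model = KMeans(n_clusters=n_categories, n_init=10, random_state=0).fit(X)
        return model.labels_, model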

[0043] In some embodiments, analyzer 130 optionally includes functionality for enriching 240 and/or sorting 250: the raw data 114, candidate events 234, events 235, any calculated feature vectors (e.g., first, second, and unified feature vectors 238v, 239v, and 236v), and/or event metadata 235m. Enriching 240 an event 235 may include additional processing to mathematically or visually more clearly identify distinguishing features of the event 235. Enriching 240 may be applied generically by performing calculations that are not unique to any particular event 235, type of event 235, or raw data 114, and may be used to calculate the unified feature vector 236v before being initially classified 270. Enriching 240 may also be applied specifically to events 235 after they have been classified 270 and identified as belonging to a particular type of event. This specific enriching 240 may generate metadata specific to each enriched event. Specific enriching 240 is calculated separately from and subsequent to generic enrichment 240 and classification 270 calculations.

[0044] Sorting 250 comprises sorting the raw data 114, candidate events 234, events 235, any calculated feature vectors (e.g., first, second, and unified feature vectors 238v, 239v, and 236v), and/or event metadata 235m. Sorting 250 may include sorting based on the duration of the event 235, the magnitude of the event 235, the type of event 235, or the like.

[0045] Database 140 is configured to store the output of analyzer 130, including, for example, raw data 114, candidate events 234, events 235, any calculated feature vectors (e.g., first, second, and unified feature vectors 238v, 239v, and 236v), and/or event metadata 235m. Alternative embodiments store this information in database 140 after completing one or both of the classifying 270 or characterizing 280 processes. Database 140 has the capacity to store large amounts of information and is in communication with computer system 150 operated by user 152.

[0046] The characterization 280 process includes calculations specific to the identified categories of events 235 to quantify aspects of the events 235. Once the type of the event 235 has been determined through the classification 270 process, the characterization 280 process further identifies information of interest for the event 235 with metrics that are specific to that type of event 235. For example, if the nature of the event 235 is a repeating waveform pattern, a specific enrichment 240 might include the number of cycles of the repeating pattern. Each of the type-specific characterizations is generated by one or more custom algorithms designed specifically to calculate that metric. The characterization enrichments can be either scalar or vector values. As more events 235 are identified and classified 270, a sufficient number of events 235 within at least some categories provide data points that can be characterized 280 by identifying trends, similarities, and differences within and between categories. Characterized 280 data has the benefit that much of it is easily understood by user 152 and can be readily presented visually on computer system 150.
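In the spirit of the cycle-counting example above, one plausible characterization metric, offered as a hypothetical sketch rather than an algorithm prescribed by the application, estimates the number of cycles in a repeating-waveform event from its dominant spectral peak:

    import numpy as np

    def estimate_repeating_cycles(event_samples, fs):
        """Estimate cycle count as dominant frequency times event duration."""
        x = np.asarray(event_samples, dtype=float)
        x = x - x.mean()                             # remove DC offset
        spectrum = np.abs(np.fft.rfft(x))
        freqs = np.fft.rfftfreq(len(x), d=1.0 / fs)
        dominant_hz = freqs[np.argmax(spectrum[1:]) + 1]   # skip the zero-frequency bin
        return dominant_hz * (len(x) / fs)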

[0047] In certain embodiments, the event detector algorithm 232 and feature detection algorithms 236 are selected based on the speed of processing the raw data 114 such that analyzer 130 analyzes raw data 114 in real-time. Certain algorithms are more computationally efficient than others. The event detector algorithm 232 and feature detection algorithms 236 are selected such that the streaming raw data 114 does not back up or blow a memory buffer.

[0048] In further embodiments, analyzer 130 is configured to ingest 210 historical raw data 114 recorded previously and stored, for example, on database 140. Many institutions have tremendous amounts of historic raw data 114 stored in databases 140. Historic raw data 114 can be analyzed by analyzer 130 to identify relevant events 235.

[0049] Illustrative embodiments and FIGs. 1A, 1B, and 2 herein describe one or more computer systems 150. The computer system 150 can include a set of instructions that can be executed to cause the computer system 150 to perform any one or more of the methods or computer-based functions disclosed herein. The computer system 150 may operate as a standalone device or may be connected, e.g., using a network, to other computer systems or peripheral devices. In addition, one or more of a mobile computer, a mobile phone, an implanted medical device, or an ambulatory medical device may be implemented as a computer system as described below.

[0050] In a networked deployment, the computer system may operate in the capacity of a server or as a client user computer in a server-client user network environment, or as a peer computer system in a peer-to-peer (or distributed) network environment. The computer system 150 can also be implemented as or incorporated into various devices, such as a personal computer (PC), a tablet PC, a set-top box (STB), a personal digital assistant (PDA), a mobile device, a palmtop computer, a laptop computer, a desktop computer, a communications device, a wireless telephone, a control system, a personal trusted device, a web appliance, a network router, switch or bridge, a mobile computer, or any other machine capable of executing a set of instructions (sequential or otherwise) that specify actions to be taken by that machine. In a particular embodiment, the computer system 150 can be implemented using electronic devices that provide voice, video and/or data communication. Further, while a single computer system 150 is illustrated, the term "system" shall also be taken to include any collection of systems or sub-systems that individually or jointly execute a set, or multiple sets, of instructions to perform one or more computer functions.

[0051] Computer system 150 may include a processor, e.g., a central processing unit (CPU), a graphics processing unit (GPU), or both. Moreover, the computer system 150 can include a main memory and a static memory that can communicate with each other via a bus. As shown, the computer system 150 may further include a video display unit, such as a liquid crystal display (LCD), an organic light emitting diode (OLED), a flat panel display, a solid state display, or a cathode ray tube (CRT). Additionally, the computer system 150 may include an input device, such as a keyboard, and a cursor control device, such as a mouse. The computer system 150 can also include a disk drive unit, a database 140 unit, a signal generation device, such as a speaker or remote control, and a network interface device.

[0052] In a particular embodiment, the disk drive unit may include a computer-readable medium in which one or more sets of instructions, e.g. software, can be embedded. Further, the instructions may embody one or more of the methods or logic as described herein. In a particular embodiment, the instructions may reside completely, or at least partially, within the main memory, the static memory, and/or within the processor during execution by the computer system 150. The main memory and the processor also may include computer-readable media.

[0053] In an alternative embodiment, dedicated hardware implementations, such as application specific integrated circuits, programmable logic arrays and other hardware devices, can be constructed to implement one or more of the methods described herein. Applications that may include the apparatus and systems of various embodiments can broadly include a variety of electronic and computer systems. One or more embodiments described herein may implement functions using two or more specific interconnected hardware modules or devices with related control and data signals that can be communicated between and through the modules, or as portions of an application-specific integrated circuit. Accordingly, the present system encompasses software, firmware, and hardware implementations.

[0054] In accordance with various embodiments of the present disclosure, the methods described herein may be implemented by software programs executable by a computer system. Further, in an exemplary, non-limited embodiment, implementations can include distributed processing, component/object distributed processing, and parallel processing. Alternatively, virtual computer system processing can be constructed to implement one or more of the methods or functionality as described herein.

[0055] The present disclosure contemplates a computer-readable medium that includes instructions or receives and executes instructions responsive to a propagated signal, so that a device connected to a network can communicate voice, video or data over the network. Further, the instructions may be transmitted or received over the network via the network interface device.

[0056] While the computer-readable medium may be a single medium, the term "computer-readable medium" includes a single medium or multiple media, such as a centralized or distributed database, and/or associated caches and servers that store one or more sets of instructions. The term "computer-readable medium" shall also include any medium that is capable of storing, encoding or carrying a set of instructions for execution by a processor or that cause a computer system to perform any one or more of the methods or operations disclosed herein.

[0057] In a particular non-limiting, exemplary embodiment, the computer-readable medium can include a solid-state memory such as a memory card or other package that houses one or more non-volatile read-only memories. Further, the computer-readable medium can be a random access memory or other volatile re-writable memory. Additionally, the computer-readable medium can include a magneto-optical or optical medium, such as a disk or tapes or other storage device to capture carrier wave signals such as a signal communicated over a transmission medium. A digital file attachment to an e-mail or other self-contained information archive or set of archives may be considered a distribution medium that is equivalent to a tangible storage medium. Accordingly, the disclosure is considered to include any one or more of a computer-readable medium or a distribution medium and other equivalents and successor media, in which data or instructions may be stored.

[0058] Although the present specification describes components and functions that may be implemented in particular embodiments with reference to particular standards and protocols, the invention is not limited to such standards and protocols. For example, standards for Internet and other packet switched network transmission (e.g., TCP/IP, UDP/IP, HTML, HTTP) represent examples of the state of the art. Such standards are periodically superseded by faster or more efficient equivalents having essentially the same functions. Accordingly, replacement standards and protocols having the same or similar functions as those disclosed herein are considered equivalents thereof.

[0059] The illustrations of the embodiments described herein are intended to provide a general understanding of the structure of the various embodiments. The illustrations are not intended to serve as a complete description of all of the elements and features of apparatus and systems that utilize the structures or methods described herein. Many other embodiments may be apparent to those of skill in the art upon reviewing the disclosure. Other embodiments may be utilized and derived from the disclosure, such that structural and logical substitutions and changes may be made without departing from the scope of the disclosure. Additionally, the illustrations are merely representational and may not be drawn to scale. Certain proportions within the illustrations may be exaggerated, while other proportions may be minimized. Accordingly, the disclosure and the figures are to be regarded as illustrative rather than restrictive.

[0060] Further implementations are within the claims.