DESCRIPTION (provided by applicant): Glioma is the most common primary brain cancer in adults, with inevitable recurrence and finite survival times. Safe maximal resection, coupled with adjuvant therapies such as radiation and chemotherapy, remains the evidence-based standard of care for glioma patients. Multiple studies have shown clear relief of symptoms, improved quality of life, a survival advantage, and delayed recurrence for patients undergoing safe maximal extent of resection (EOR). This benefit is substantial for high-grade glioma (glioblastoma, GBM) patients, and is even more critical for low-grade glioma patients, who can enjoy years of improved survival. However, even very experienced surgeons using the best clinically available technologies find it extremely challenging to visually distinguish cancer from non-cancer brain tissue intraoperatively. On one hand, residual cancer hastens recurrence, increases resistance to adjuvant therapies, and worsens survival; on the other hand, resection of normal functional brain (e.g., speech and motor areas) can lead to poor functional status and worse survival outcomes. The current lack of effective intraoperative guidance technologies prevents neurosurgeons from achieving maximal safe EOR despite its clear survival advantage. The objective of this proposal is to develop and evaluate the ability of a high-speed, high-resolution, non-invasive, and label-free optical coherence tomography (OCT) imaging technology, together with a novel tissue optical property quantification algorithm, to distinguish cancer from non-cancer in real time with high sensitivity and specificity. Our preliminary data, recently published in Science Translational Medicine, suggest the exciting potential of OCT for distinguishing brain cancer from non-cancer.
To fully investigate the capability and potential of OCT for label-free, quantitative, and real-time assessment of brain cancer in an intraoperative setting, we propose the following aims. In Aim 1, we will develop a high-speed OCT imaging platform to identify human brain cancer infiltration with minimal motion and blood artifacts. We will also develop a novel processing algorithm for rapid and robust optical property retrieval from volumetric OCT imaging data, and a method for constructing a color-coded optical property map that provides a direct visual cue for distinguishing cancer from non-cancer at high resolution. In Aim 2, we will perform the first systematic evaluation of OCT using ex vivo brain tissues from 30 GBM and 30 low-grade brain cancer patients, as well as a novel in vivo murine brain cancer model implanted with patient-derived GBM cell lines. Using histopathological analyses as the gold standard, we will establish the first quantitative OCT diagnostic thresholds and determine the sensitivity and specificity of OCT for brain cancer identification. In Aim 3, we will address the feasibility of bringing the real-time intraoperative capabilities of OCT into the operating room (OR) by conducting a pilot in vivo OCT imaging study with an additional 30 GBM and 30 low-grade brain cancer patients. This pilot study should pose minimal risk to the patient: the imaging light intensity is low, all imaging data will be collected in a sterile, non-contact manner, and the pilot study will not influence any clinical decisions or the extent of resection. In summary, our proposed study will be the first to provide quantitative, real-time brain cancer tissue identification using a color-coded optical property map; the first to provide a systematic evaluation of this translational OCT technology (involving 120 patients); and the first to provide non-invasive, label-free, non-contact, real-time differentiation of brain cancer from non-cancer in the OR.
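The optical property retrieval and color-coded mapping described in Aim 1 can be illustrated in miniature. The sketch below is not the proposal's actual algorithm; it assumes a common single-scattering model in which OCT signal intensity decays with depth as I(z) ≈ I0·exp(−2μz), estimates the attenuation coefficient μ by a log-linear fit to each depth profile (A-scan), and then colors each lateral position by comparing μ to a placeholder diagnostic threshold (the real threshold would be established against histopathology, as in Aim 2; the function names and the threshold value are illustrative assumptions).

```python
import numpy as np

def attenuation_coefficient(a_scan, dz_mm):
    """Estimate the optical attenuation coefficient (mm^-1) from one OCT
    depth profile, assuming I(z) ~ I0 * exp(-2*mu*z) (single scattering).
    A log-linear least-squares fit gives slope = -2*mu."""
    z = np.arange(a_scan.size) * dz_mm          # depth axis in mm
    log_i = np.log(np.clip(a_scan, 1e-12, None))  # avoid log(0)
    slope, _intercept = np.polyfit(z, log_i, 1)
    return -slope / 2.0

def color_code(mu_map, threshold):
    """Build a simple red/green overlay from a 2-D map of attenuation
    values: pixels below the threshold are flagged as suspected cancer
    (red), the rest as non-cancer (green). Which side of the threshold
    indicates cancer, and the threshold itself, are placeholders here."""
    rgb = np.zeros(mu_map.shape + (3,))
    suspected = mu_map < threshold
    rgb[suspected, 0] = 1.0    # red channel: suspected cancer
    rgb[~suspected, 1] = 1.0   # green channel: non-cancer
    return rgb
```

A real pipeline would apply `attenuation_coefficient` to every A-scan in a volumetric acquisition, producing the en face attenuation map that `color_code` then renders as the surgeon-facing visual cue.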
These advances will open the door to future large-scale clinical trials using OCT to guide brain surgery and increase extent of resection, not only for glioma patients but also for patients with other cancers (such as cancers metastatic to the brain, and oral, cervical, and GI cancers), thereby improving patient survival.