Machine learning (ML) has made great advances in imaging for breast cancer detection, including reducing radiologists' read times, yet its reported performance remains at best comparable to that of expert radiologists. This leaves a performance gap between what radiologists desire and what can actually be achieved in early detection, reduction of excessive false positives, and minimization of unnecessary biopsies. We have seen a similar situation in military intelligence, where operators describe themselves as "drowning in data and starving for information". We invented Upstream Data Fusion (UDF) to help fill the gap. ML is used to produce candidate detections for individual sensing modalities with high detection rates but also high false positive rates; data fusion then combines modalities to dramatically reduce false positives. Upstream data, which is closer to raw sensor data, is difficult for operators to visualize, yet fusing it recovers information that would otherwise be lost in the processing applied to make imagery visually acceptable to humans. Our research on breast cancer detection, involving the fusion of Digital Breast Tomosynthesis (DBT) with Magnetic Resonance Imaging (MRI) and of DBT with ultrasound (US) data, has yielded preliminary results that lead us to conclude that UDF can help both fill the performance gap and reduce radiologist read time. Our findings suggest that UDF, combined with ML techniques, can produce paradigm changes in the achievable accuracy and efficiency of early breast cancer detection.
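The fusion idea described above can be sketched in a few lines. This is a minimal illustration under our own assumptions, not the authors' UDF implementation: hypothetical per-modality detectors emit candidate lesions at high sensitivity, and fusion keeps only candidates corroborated by a second modality within a spatial tolerance. The `Candidate` class, the 5 mm tolerance, and the noisy-OR score combination are all illustrative choices.

```python
# Hedged sketch of cross-modality candidate fusion (illustrative only).
# Assumes DBT and MRI candidates have been registered into a common
# coordinate frame; all names, coordinates, and scores are invented.
from dataclasses import dataclass
from math import dist

@dataclass
class Candidate:
    xy: tuple            # lesion location in a common frame (mm)
    score: float         # detector confidence in [0, 1]

def fuse(dbt, mri, tol_mm=5.0):
    """Keep DBT candidates that some MRI candidate corroborates within tol_mm."""
    fused = []
    for c in dbt:
        matches = [m for m in mri if dist(c.xy, m.xy) <= tol_mm]
        if matches:
            best = max(matches, key=lambda m: m.score)
            # Combine evidence with a simple noisy-OR of the two scores.
            fused.append(Candidate(c.xy, 1 - (1 - c.score) * (1 - best.score)))
    return fused

dbt = [Candidate((10.0, 20.0), 0.6), Candidate((40.0, 5.0), 0.7)]  # second is a false positive
mri = [Candidate((11.0, 21.0), 0.5)]                               # corroborates the first only
print(fuse(dbt, mri))  # only the corroborated candidate survives, with a boosted score
```

The sketch shows the mechanism, not the performance claim: each single-modality detector is tuned for high sensitivity (so it also produces many false positives), and the requirement of cross-modality agreement is what suppresses the false positives.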