Automated Writing of Problem Reports

Navy SBIR 25.1- Topic N251-025
Naval Sea Systems Command (NAVSEA)
Pre-release 12/4/24   Opens to accept proposals 1/8/25   Closes 2/5/25 12:00pm ET

N251-025 TITLE: Automated Writing of Problem Reports

OUSD (R&E) CRITICAL TECHNOLOGY AREA(S): Trusted AI and Autonomy

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with the Announcement. Offerors are advised foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Develop a software solution that utilizes artificial intelligence (AI) and machine learning (ML) to automate the generation of accurate and consistent problem reports for surface navy platforms.

DESCRIPTION: The Navy's Aegis Combat System (ACS) currently relies on problem reports generated manually from testing, analysis, and operator feedback. These reports are populated by personnel with varying degrees of experience, who attempt to identify issues that may or may not agree with the written performance specifications for their systems. As a result, reports often lack the details needed to reproduce the issues. The process is time consuming, prone to human error, and inconsistent across platforms and personnel. The Navy seeks an innovative software solution capable of automatically generating accurate and reliable problem reports for surface navy platforms, replacing the manual processes used in the current ACS. This can be accomplished by merging, removing, or automatically generating problem reports to reduce operator workload and improve efficiency. The added capability would provide more accurate problem reports, written against specifications, to support human understanding and improve human-systems integration (HSI). No commercially available solution currently addresses this need.
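
For illustration of the merge-or-remove path described above, the following is a minimal sketch, in Python, of flagging likely-duplicate problem reports for operator review. The report fields, similarity rule, and threshold are assumptions for illustration only, not NAVSEA-specified formats or algorithms.

```python
# Minimal sketch: flag likely-duplicate problem reports for operator review.
# All field names and the similarity threshold are illustrative assumptions,
# not NAVSEA-specified formats.
from dataclasses import dataclass
from difflib import SequenceMatcher
from itertools import combinations

@dataclass
class ProblemReport:
    report_id: str
    system: str        # subsystem the report is written against (assumed field)
    summary: str       # operator- or tool-generated description of the anomaly

def similarity(a: str, b: str) -> float:
    """Crude text similarity in [0, 1]; a fielded tool would use stronger NLP."""
    return SequenceMatcher(None, a.lower(), b.lower()).ratio()

def candidate_merges(reports: list[ProblemReport], threshold: float = 0.8):
    """Yield pairs of reports on the same system whose summaries look like duplicates."""
    for r1, r2 in combinations(reports, 2):
        if r1.system == r2.system and similarity(r1.summary, r2.summary) >= threshold:
            yield (r1.report_id, r2.report_id)

if __name__ == "__main__":
    reports = [
        ProblemReport("PR-001", "radar", "Track drops intermittently during high clutter"),
        ProblemReport("PR-002", "radar", "Track drops intermittently in high clutter conditions"),
        ProblemReport("PR-003", "display", "Console symbology lags behind track updates"),
    ]
    for pair in candidate_merges(reports):
        print("Possible duplicate reports:", pair)
```

In this sketch the tool only proposes merges; a human operator confirms them, consistent with the HSI emphasis above.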

The observation of anomalous behaviors or unexpected testing and analysis results will drive the generation of problem reports. The software solution needs to provide a system that can perceive, recognize, learn, decide, and act on its own. The solution will need to consolidate and interpret data. It will utilize ML systems able to explain their rationale, characterize their strengths and weaknesses, and convey an understanding of how they will behave in the future. The software application will also need to be capable of analyzing existing reports to understand format and content requirements. It will need to accept input from the test community and operators, automatically generate accurate and reliable problem reports from various data sources, including sensor data, test observations, and operational experiences, and integrate seamlessly with all elements of the ACS.
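
To make the intended data flow concrete, the following is a minimal sketch of consolidating sensor data, test observations, and operator input into a structured report draft that carries a rationale trail for explainability. All data structures, field names, and the spec-limit rule are illustrative assumptions; actual ACS data formats and detection logic are not specified here.

```python
# Minimal sketch: consolidate heterogeneous inputs into a structured problem-report
# draft. Structures, field names, and the spec-limit rule are illustrative
# assumptions; actual ACS data formats would be provided by the program office.
from dataclasses import dataclass, field

@dataclass
class SensorSample:
    source: str          # e.g., "radar" (assumed label)
    metric: str          # e.g., "track_update_latency_ms" (assumed metric name)
    value: float
    spec_limit: float    # value taken from the written performance specification

@dataclass
class ReportDraft:
    title: str
    observations: list[str] = field(default_factory=list)
    rationale: list[str] = field(default_factory=list)   # explainability trail

def draft_report(samples: list[SensorSample],
                 test_notes: list[str],
                 operator_notes: list[str]) -> ReportDraft:
    """Flag samples that exceed their spec limit and fold in human observations."""
    draft = ReportDraft(title="Auto-generated problem report (draft)")
    for s in samples:
        if s.value > s.spec_limit:
            draft.observations.append(
                f"{s.source}: {s.metric}={s.value} exceeds spec limit {s.spec_limit}")
            draft.rationale.append(
                f"Flagged because measured {s.metric} was above the specification value")
    draft.observations.extend(f"Test note: {n}" for n in test_notes)
    draft.observations.extend(f"Operator note: {n}" for n in operator_notes)
    return draft
```

A fielded solution would replace the simple spec-limit rule with learned anomaly detection, but the structure shown (machine-checkable observations paired with a rationale trail) reflects the explainability requirement above.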

Because of the planned implementation in both operational and testing environments, the software application should permit realistic testing of evolving threat types and configurations in a dynamic test environment. This will improve the efficiency of testing and certification problem-reporting timelines for new Aegis programs. It will also help maintain or improve product quality through early detection of deficiencies in the product. The speed and accuracy of the solution must exceed existing ACS performance attributes by 10% or better.
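
As a worked illustration of the 10% threshold, the following sketch scores aggregate improvement across performance attributes using hypothetical baseline and prototype numbers; the simple averaging follows the aggregate interpretation given in the Topic Q&A below (improvements of 7% and 14% average to 10.5%).

```python
# Minimal sketch: score aggregate improvement over baseline performance attributes.
# The attribute names and numbers are illustrative assumptions, not ACS baselines.
def percent_improvement(baseline: float, new: float, higher_is_better: bool = True) -> float:
    """Percent improvement of `new` over `baseline` for a single attribute."""
    change = (new - baseline) if higher_is_better else (baseline - new)
    return 100.0 * change / baseline

baseline = {"report_accuracy": 0.80, "minutes_per_report": 45.0}    # assumed values
prototype = {"report_accuracy": 0.856, "minutes_per_report": 38.7}  # assumed values

improvements = [
    percent_improvement(baseline["report_accuracy"], prototype["report_accuracy"]),
    percent_improvement(baseline["minutes_per_report"], prototype["minutes_per_report"],
                        higher_is_better=False),
]
aggregate = sum(improvements) / len(improvements)
print(f"Aggregate improvement: {aggregate:.1f}%")   # ~10.5% with these example numbers
```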

Work produced in Phase II may become classified. Note: The prospective contractor(s) must be U.S. owned and operated with no foreign influence as defined by 32 CFR Part 2004.20 et seq., National Industrial Security Program Executive Agent and Operating Manual, unless acceptable mitigating procedures can and have been implemented and approved by the Defense Counterintelligence and Security Agency (DCSA), formerly the Defense Security Service (DSS). The selected contractor must be able to acquire and maintain a secret level facility clearance and Personnel Security Clearances. This will allow contractor personnel to perform on advanced phases of this project as set forth by DCSA and NAVSEA in order to gain access to classified information pertaining to the national defense of the United States and its allies; this will be an inherent requirement. The selected company will be required to safeguard classified material during the advanced phases of this contract IAW the National Industrial Security Program Operating Manual (NISPOM), which can be found at Title 32, Part 2004.20 of the Code of Federal Regulations.

PHASE I: Develop a concept for a problem report solution that demonstrates it can feasibly meet the parameters in the Description. Feasibility will be demonstrated through modeling and simulation. The Phase I Option, if exercised, will include the initial design specifications and capabilities description to build a prototype solution in Phase II.

PHASE II: Develop and deliver a prototype problem report solution based on the results of Phase I that meets the parameters described in the Description. Demonstration will take place at a government-provided facility. A government subject matter expert will evaluate the prototype to ensure it improves situational visualization and understanding within a varied problem-reporting context.

It is probable that the work under this effort will be classified under Phase II (see Description section for details).

PHASE III DUAL USE APPLICATIONS: Support the Navy in transitioning the technology for Government use within the Aegis Weapon System (AWS), Advanced Capability Build (ACB) 20 or higher, as part of an integrated AWS database. Refine the prototype for integration into the current AWS operational planning tools. Test and refine the prototype design for the appropriate interfaces with other Navy systems and to comply with information security requirements.

The developed technology should be broadly applicable to live testing of manned and unmanned systems, and to simulations in which users need Course of Action (COA) planning and updates to the plan as time progresses. Dual-use applications are numerous; almost any analyst seeking to combine spatial and temporal data in a single display could use this technology, including the Federal Aviation Administration and civilian air traffic controllers.

REFERENCES:

1. Campos, Luna. "Data Synchronization: What It Is and How to Sync Data for Beginners." HubSpot, December 05, 2022. https://blog.hubspot.com/website/data-synchronization

2. "Software Solution." ScienceDirect Topics. https://www.sciencedirect.com/topics/computer-science/software-solution

3. "AI Tools for Generating Reports." GeeksforGeeks. https://www.geeksforgeeks.org/ai-tools-for-generating-reports/#jenni-ai

3. "National Industrial Security Program Executive Agent and Operating Manual (NISP), 32 U.S.C. § 2004.20 et seq. (1993)." https://www.ecfr.gov/current/title-32/subtitle-B/chapter-XX/part-2004

KEYWORDS: Anomalous behaviors; unexpected testing and analysis results; automatically generated problem reports; machine learning to automate; testing and certification problem reporting timelines; consistency across platforms


** TOPIC NOTICE **

The Navy Topic above is an "unofficial" copy from the Navy Topics in the DoD 25.1 SBIR BAA. Please see the official DoD Topic website at www.dodsbirsttr.mil/submissions/solicitation-documents/active-solicitations for any updates.

The DoD issued its Navy 25.1 SBIR Topics pre-release on December 4, 2024, which opens to receive proposals on January 8, 2025, and closes February 5, 2025 (12:00pm ET).

Direct Contact with Topic Authors: During the pre-release period (December 4, 2024, through January 7, 2025) proposing firms have an opportunity to directly contact the Technical Point of Contact (TPOC) to ask technical questions about the specific BAA topic. Once DoD begins accepting proposals on January 8, 2025, no further direct contact between proposers and topic authors is allowed unless the Topic Author is responding to a question submitted during the pre-release period.

DoD On-line Q&A System: After the pre-release period, until January 22, at 12:00 PM ET, proposers may submit written questions through the DoD On-line Topic Q&A at https://www.dodsbirsttr.mil/submissions/login/ by logging in and following instructions. In the Topic Q&A system, the questioner and respondent remain anonymous but all questions and answers are posted for general viewing.

DoD Topics Search Tool: Visit the DoD Topic Search Tool at www.dodsbirsttr.mil/topics-app/ to find topics by keyword across all DoD Components participating in this BAA.

Help: If you have general questions about the DoD SBIR program, please contact the DoD SBIR Help Desk via email at [email protected]

Topic Q & A

1/5/25  Q.
  1. What specific data sources (e.g., sensor data, operator feedback) will be available for integration into the automated problem report system? Are there any constraints or preferences for data formats?
  2. The RFP specifies a 10% improvement over existing performance attributes. Can you provide baseline metrics or details on how performance is currently measured?
  3. How should the solution balance functionality for operational environments versus testing scenarios? Are there unique requirements for either use case?
  4. The software must explain its rationale and characterize strengths and weaknesses. What level of detail is required for these explanations, and should they be tailored for technical or non-technical users?
  5. Since Phase II work may become classified, are there specific security protocols or considerations for the initial development phase to ensure seamless transition to classified work?
  6. How critical is human-system integration in the automated report generation process? Should the solution include features for operator validation or real-time adjustments?
  7. For potential dual-use commercialization, are there specific industries (e.g., aviation, healthcare) or applications that should be prioritized for adaptability beyond the Navy’s Aegis Combat System?
   A.
  1. This will be provided after the Phase I award by the TPOC
  2. The 10% target applies to aggregate performance; for example, improvements of 7% and 14% average to (7% + 14%) / 2 = 10.5%, which would meet the threshold.
  3. This will be provided after the Phase I award by the TPOC
  4. This will be provided after the Phase I award by the TPOC
  5. Phase I efforts are unclassified. I recommend all companies have their DD254s and FCLs in good order for a variety of SBIR topics.
  6. I would not want to limit your innovative capabilities for the BAA Postings
  7. A Small Business should always have commercialization in mind while performing SBIR Topics
12/31/24  Q. A) Data Collection and Schema

    1. Sensor Data:
  • Could you provide a schema or description of the types of sensor data collected by ACS (e.g., radar, sonar, environmental)?
  • What formats are typically used for this data (e.g., CSV, JSON, XML)?
  • Does the sensor data include real-time streams, historical logs, or both?
    2. Test Observations:
  • What formats are test observations usually documented in (e.g., CSV files, Excel sheets, free-text reports)? Do test observations contain quantitative metrics, qualitative notes, or both?
  • Are these observations structured or variable in nature?
    3. Operational Experiences:
  • How are operational experiences documented (e.g., operator notes, audio logs, free-text annotations)?
  • Is there any variability in how different operators document their observations?
    4. Data Similarity:
  • If actual data cannot be shared, can you provide examples or similar data structures to help us design relevant models and pipelines?
  • Are there any publicly available datasets or simulations that closely resemble the data types used by ACS?

B) Report Requirements:

    1. Content and Format:
  • Are there any specific templates or formats the problem reports must adhere to? Would it be possible to get a sample problem report?
  • What are the must-have components in the reports (e.g., root cause analysis, data visualizations, performance metrics)?
    2. Summarization and Merging:
  • Should the tool prioritize merging data from multiple sources (e.g., sensor logs, test observations) into a single cohesive report?
  • Are there challenges with merging inputs from different sources that we should account for?
    3. Explainability:
  • How important is it for the tool to provide traceability or explanations for how the report was generated?
C) AI/ML Model Considerations

    1. Model Scope:
  • Should we consider different models for handling structured numerical data (e.g., sensor logs) and unstructured data (e.g., operator notes)?
  • Are there preferences for lightweight models versus more complex models like transformers for text summarization?
    2. Evaluation Metrics:
  • What metrics will be used to evaluate the performance of the final system (e.g., report accuracy, completeness, speed)?
  • Should the system prioritize precision (fewer false positives) or recall (fewer missed issues)?
D) System Constraints

    1. Legacy System Integration:
  • Are there legacy system constraints or compatibility requirements we should be aware of when designing the proposed solution?
    2. Handling Incomplete Data:
  • Should the tool attempt to generate partial reports in cases where data from certain sources is missing?
   A. A) Data Collection and Schema

    1. Sensor Data:
  • Yes, Aegis platforms have a wide variety of sensors that collect data in the performance of their duties.
  • Format can be discussed after Phase I award
  • Yes
    2. Test Observations:
  • Documentation of TORs is typically a manual input.
  • Yes, I would not want to restrict your innovative approach
    3. Operational Experiences:
  • Operational experiences are documented by operators, who may be junior or senior personnel, and entries may be lengthy or brief.
  • Yes; operators may be junior or senior personnel, and their documentation may be lengthy or brief.
    4. Data Similarity:
  • Structures can be supplied after Phase I award. Develop a concept for a problem report solution that demonstrates it can feasibly meet the parameters in the Description.
  • References have been provided within the BAA Listing.

B) Report Requirements:

    1. Content and Format:
  • Templates can be supplied after Phase I award. Develop a concept for a problem report solution that demonstrates it can feasibly meet the parameters in the Description.
  • This can be supplied after Phase I award. Develop a concept for a problem report solution that demonstrates it can feasibly meet the parameters in the Description.
    2. Summarization and Merging:
  • Prioritizing data will be the direction of the TPOC after Phase I award.
  • I would not want to limit your innovative capabilities
    3. Explainability:
  • I would not want to limit your innovative capabilities
C) AI/ML Model Considerations

    1. Model Scope:
  • I would not want to limit your innovative capabilities
  • I would not want to limit your innovative capabilities
    2. Evaluation Metrics:
  • This will be provided after Phase I award
  • I would not want to limit your innovative capabilities
D) System Constraints

    1. Legacy System Integration:
  • Not that I’m aware of at the moment
    2. Handling Incomplete Data:
  • I would not want to limit your innovative capabilities
12/20/24  Q. 1. What is the expected user input for automatically generating user reports?
2. Is the auto-generation of problem reports expected to follow a template-based approach, leverage unique Natural Language Generation (NLG), or utilize a hybrid method combining both?
   A. 1. Current input is entered manually by the operator; the desire is for automation.
2. The Navy does not want to limit your innovative capability for the topic

