Project Participants: Huntington Ingalls Industries – Ingalls Shipbuilding
Project Start: August 2020
A surface combatant such as the Arleigh Burke-class (DDG 51) guided missile destroyer is built from hundreds of thousands of parts, each selected by Ingalls Engineering, sourced and purchased by Ingalls Supply Chain, and installed by Ingalls Operations in accordance with ship design requirements. Engineers must perform extensive research to identify and select parts that meet ship design specifications. During this identification and selection phase, engineers unknowingly select parts believed to be new to the ship’s design when in fact the parts have already been used in other areas of the ship or on other vessels already built. The primary objective of this project is to reduce the time it takes engineers to research, identify, and select parts, and to reduce the number of parts duplicated each year, thereby reducing the engineering and supply chain labor associated with new part creation.
The Visual Search Engine project, managed by the Naval Shipbuilding and Advanced Manufacturing (NSAM) Center, will investigate new technologies that enable component searches across all libraries and databases used in the design process for parts with an appropriate or similar fit, form, and function. The anticipated solution space is expected to utilize content-based image retrieval (CBIR), also known as query by image content (QBIC) or content-based visual information retrieval (CBVIR). CBIR is the application of computer vision techniques to image retrieval problems. To accomplish this, Ingalls Shipbuilding will “index” parts catalogs via “visual fingerprinting” (e.g., attaching image data to components in text-based libraries). Engineers and Supply Chain technicians will then only need to provide a shape input to the search engine to locate parts of similar shape, fit, form, and function. “Content-based” means that the search analyzes the contents of the image rather than metadata such as keywords, tags, or descriptions associated with the image; reliance on such metadata is currently the “long pole in the tent” when performing parts searches in Ingalls’ parts libraries. The term “content” in this context might refer to colors, shapes, textures, or any other information that can be derived from the image itself. CBIR is desirable because searches that rely purely on metadata depend on annotation quality and completeness and are often very time consuming to complete.
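The indexing-then-search workflow described above can be sketched in miniature. The snippet below is a hypothetical illustration, not the project’s implementation: each part’s “visual fingerprint” is a toy four-value shape descriptor (a real CBIR system would extract descriptors from part images), the part IDs and values are invented, and similarity is plain cosine similarity.

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two fingerprint vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Indexed parts catalog: part ID -> visual fingerprint.
# Part IDs and descriptor values are illustrative assumptions.
catalog = {
    "VALVE-90MM":   [0.90, 0.10, 0.00, 0.20],
    "ELBOW-45DEG":  [0.10, 0.80, 0.30, 0.00],
    "FLANGE-150LB": [0.85, 0.15, 0.05, 0.25],
}

def search(query_fingerprint, index, top_k=2):
    """Rank indexed parts by visual similarity to the query shape."""
    scored = [(cosine_similarity(query_fingerprint, fp), part)
              for part, fp in index.items()]
    scored.sort(reverse=True)
    return [part for _, part in scored[:top_k]]

# A query shape resembling the valve/flange family ranks those parts
# ahead of the dissimilar elbow fitting.
results = search([0.88, 0.12, 0.02, 0.22], catalog)
print(results)  # → ['VALVE-90MM', 'FLANGE-150LB']
```

The point of the sketch is the division of labor: indexing attaches a fingerprint to every cataloged part once, so that each engineer’s query reduces to a fast nearest-neighbor ranking rather than a manual metadata search.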
This project is expected to result in savings of approximately $2.6M annually across all platforms, with five-year savings of $7.6M. The Visual Search Engine technology is expected to be implemented at the Ingalls facility during the fourth quarter of FY23.
Project Related Reports & Documents