TraceParts lends its support to Stanford University for a scan-to-CAD research project.


The TraceParts CAD model library comes to the aid of researchers in the Computer Science Department at Stanford University, United States.

Saint-Romain, France, April 18, 2019 – A team at Stanford University has been working on a research project that generates CAD models from digitized 3D data.

Minhyuk Sung, a computer science Ph.D. student at Stanford University advised by Leonidas Guibas, professor of computer science at Stanford, has proposed a new scan-to-CAD system that parameterizes a scanned 3D point cloud with multiple geometric primitives, including planes, spheres, cylinders, and cones. Through the output primitives, users can easily manipulate the scanned data in many downstream applications, such as shape editing.

Figure 1. The method takes raw scanned 3D point cloud data and converts it into a set of geometric primitives, including ones for tiny segments. The output primitives can be used in many applications, such as shape editing.

The aim of the research is to design a stable, fully automatic system that runs without any user control. Existing techniques for fitting primitives to an input point cloud rely on user parameters, and fine-tuning those parameters for each input is crucial for performance. The team avoids this problem and automates the fitting process by introducing a deep-learning-based system that predicts high-level information from the input data and then estimates the primitives more robustly based on that information.
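The paper itself describes the full neural pipeline; as a simplified illustration of just the primitive-estimation step, the sketch below fits a plane to a segmented cluster of points by least squares. This is a hypothetical example for intuition, not the team's code: the `fit_plane` function and the sample data are assumptions.

```python
import numpy as np

def fit_plane(points):
    """Least-squares plane fit to an (N, 3) point cloud segment.

    Returns (centroid, unit_normal): the plane passes through the
    centroid, and the normal is the direction of least variance,
    obtained via SVD of the centered points.
    """
    centroid = points.mean(axis=0)
    centered = points - centroid
    # Rows of vt are the principal axes of the segment; the last row
    # (smallest singular value) is the plane normal.
    _, _, vt = np.linalg.svd(centered, full_matrices=False)
    normal = vt[-1]
    return centroid, normal

# Example: points sampled exactly on the plane z = 2.
rng = np.random.default_rng(0)
xy = rng.uniform(-1.0, 1.0, size=(100, 2))
pts = np.column_stack([xy, np.full(100, 2.0)])
centroid, normal = fit_plane(pts)
```

Classical pipelines run fits like this (and RANSAC variants for cylinders, spheres, and cones) with hand-tuned thresholds per input; the Stanford approach instead learns the segmentation and per-point primitive information, so the final fits need no manual tuning.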

The prediction in the pipeline is enabled by training a neural network with supervision on a large-scale CAD database provided by TraceParts. The figure below shows a comparison between the existing approach (1st row) and the proposed method (2nd row). The proposed deep-learning-based system handles challenging cases more robustly; for example, at the right end, two adjacent cylinders with slightly different radii (green and light blue) are not correctly recognized by the existing technique but are properly distinguished by the novel method.

Figure 2. Comparison between results of the existing technique (1st row) and the proposed method (2nd row). Primitives that are very close to each other can be easily confused, but the proposed learning-based method can process such cases more robustly and accurately.

"To carry out the research, we were on the lookout for a large-scale CAD database that we could use in our project. We came across the free 3D CAD model library from TraceParts. The TraceParts team invited us to use their API to create our application and thereby connect their CAD database to it. Access to the TraceParts database is a tremendous opportunity for us because it is vast and sufficiently diversified. It is perfect for supporting several research projects,"

explains Minhyuk Sung about the TraceParts library.

In addition to Sung and Guibas, the research team comprises Lingxiao Li, Anastasia Dubrovina, and Li Yi at Stanford University. The work will be presented at CVPR 2019, a top-tier computer vision conference, to be held in Long Beach, California, USA, between June 16 and 20, 2019.

About TraceParts

TraceParts is one of the world’s leading providers of 3D digital content for engineering. As part of the Trace Group founded in 1990, the company provides powerful web-based solutions, such as CAD part libraries, electronic catalogs and product configurators.

TraceParts offers digital marketing services to help part vendors, 3D printing suppliers, software and computer hardware vendors promote their products and services and generate high-quality B2B sales leads.

The TraceParts portal is available free of charge to millions of CAD users worldwide. It provides access to hundreds of supplier catalogs and more than 100 million CAD models and product datasheets that meet the specific needs of design, purchasing, manufacturing and maintenance processes and operations.

The TraceParts Blog

For component suppliers, design software or computer hardware vendors and 3D printing & rapid prototyping professionals.