Fused filament fabrication (FFF) has been widely used in various industries, and adoption of the technology is growing significantly. However, the FFF process has several disadvantages, such as inconsistent part quality and poor print repeatability, and these shortcomings are often caused by manufacturing-induced defects. This study aims to develop and implement an on-site monitoring system for extrusion-based 3D printers, consisting of a camera attached to the print head and a laptop that processes the video feed, incorporating computer vision and object detection models to detect defects and make corrections in real time. Image data from two classes of defects were collected to train the model. Various YOLO architectures were evaluated for their ability to detect and classify printing anomalies such as under-extrusion and over-extrusion. Four of the trained models, YOLOv3 and YOLOv4 with the "Tiny" variation, achieved a mean average precision score of >80% using the AP50 metric. Subsequently, two of the models (YOLOv3-Tiny, 100 and 300 epochs) were optimized using Open Neural Network Exchange (ONNX) model conversion and ONNX Runtime to improve the inference speed. A classification accuracy of 89.8% and an inference speed of 70 frames per second were obtained. Before implementing the on-site monitoring system, a correction algorithm was developed to perform simple corrective actions based on the defect classification. The G-codes of the corrective actions were sent to the printers during the printing process. This implementation successfully demonstrated real-time monitoring and autonomous correction during the FFF 3D printing process, and it paves the way for on-site monitoring and correction through closed-loop feedback in other additive manufacturing (AM) processes.
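The correction algorithm described above maps a defect classification to a simple corrective G-code command sent to the printer mid-print. A minimal sketch of that idea is shown below; the class names, flow-rate step sizes, and the choice of the Marlin `M221` (set flow percentage) command are illustrative assumptions, not the authors' actual implementation.

```python
# Hypothetical sketch of a defect-to-G-code correction step.
# Assumptions (not from the paper): defect classes are the strings
# "under-extrusion" / "over-extrusion", corrections adjust the flow
# rate in 10% steps, and the printer accepts Marlin-style M221 S<pct>.

def corrective_gcode(defect_class: str, current_flow: int = 100) -> str:
    """Return a G-code line adjusting extrusion flow for the detected defect."""
    if defect_class == "under-extrusion":
        new_flow = min(current_flow + 10, 150)  # push more material, capped
    elif defect_class == "over-extrusion":
        new_flow = max(current_flow - 10, 50)   # push less material, floored
    else:
        raise ValueError(f"unknown defect class: {defect_class}")
    return f"M221 S{new_flow}"  # Marlin: set extrusion flow percentage

# Usage: an under-extrusion detection at the default 100% flow
# yields the command "M221 S110", which would be streamed to the
# printer over its serial/G-code interface during the print.
```

In a closed-loop setup, this function would sit between the object-detection output (the per-frame defect class from the YOLO model) and the printer's G-code interface, so each detection immediately adjusts the ongoing print.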
Download full-text PDF:
- PMC: http://www.ncbi.nlm.nih.gov/pmc/articles/PMC10280217
- DOI: http://dx.doi.org/10.1089/3dp.2021.0231