In this study, a simple yet versatile method is proposed for identifying the number of exfoliated graphene layers transferred onto an oxide substrate from optical images. The method is trained on a limited number of input images, paired with a more conventional set of a few thousand publicly available GitHub images for testing and prediction. Two thresholding approaches were employed: a standard deviation-based approach and a linear regression-based approach. The method leverages the red, green, and blue color channels of the image pixels and establishes a correlation between the green channel of the background and the green channel of the various graphene layers. It proves to be a feasible alternative both to deep learning-based graphene recognition and to traditional microscopic analysis. The proposed methodology performs well when the effect of surrounding light on the graphene-on-oxide sample is minimal, allowing rapid identification of the various graphene layers. The study additionally examines the methodology under nonhomogeneous lighting conditions, demonstrating successful prediction of graphene layers from images of lower quality than those typically published in the literature. Overall, the proposed methodology enables non-destructive identification of graphene layers from optical images through a new and versatile method that is quick, inexpensive, and works well with fewer images that need not be of high quality.
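The core idea described above — relating the green channel of the bare substrate to that of flake regions — can be sketched as follows. This is a minimal illustration, not the authors' implementation: the contrast boundaries, function names, and layer-count mapping are all hypothetical assumptions chosen for demonstration.

```python
import numpy as np

# Hypothetical green-channel contrast boundaries separating
# substrate / monolayer / bilayer / 3+ layers (illustrative values only,
# not taken from the study).
BOUNDARIES = np.array([0.02, 0.07, 0.13])

def green_contrast(region_green_mean, background_green_mean):
    """Relative green-channel contrast of a flake region against the
    bare oxide background, using the mean green value of each region."""
    return (background_green_mean - region_green_mean) / background_green_mean

def layer_count(contrast):
    """Map a contrast value to a layer count via fixed thresholds.
    np.digitize returns the index of the bin the value falls into:
    0 -> substrate, 1 -> monolayer, 2 -> bilayer, 3 -> 3+ layers."""
    return int(np.digitize(contrast, BOUNDARIES))

# Example: a region slightly darker in green than the background
# would be classified as a thin flake under these assumed thresholds.
c = green_contrast(170.0, 180.0)   # ~0.056 relative contrast
layers = layer_count(c)            # falls in the "monolayer" bin here
```

In practice the region and background means would be measured from segmented areas of the optical image, and the thresholds would be fitted per substrate and illumination setup (e.g., via the standard deviation- or regression-based approaches the study mentions).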

DOI: http://dx.doi.org/10.1088/1361-6528/ace979

