This code performs the fractal analysis of particle contours. It characterises the morphology of soil particles over the range of experimentally accessible scales through three new non-dimensional quantitative morphological descriptors, characterising: (i) the overall particle shape at the macro-scale, (ii) the particle regularity at the meso-scale, and (iii) the particle texture at the micro-scale. The characteristic size separating structural from textural features emerges directly from the results of the fractal analysis of the particle contour, and is a decreasing fraction of the particle dimension.
For further details and applications of the method, please refer to the following publications:
- G Guida, F Casini, GMB Viggiani (2017), "Contour fractal analysis of grains", EPJ Web of Conferences, 140, 05008
- G Guida, G Viggiani, F Casini (2020), "Multi-scale morphological descriptors from the fractal analysis of particle contour", Acta Geotechnica, 15 (5), 1067-1080
- G Guida, D Sebastiani, F Casini, S Miliziano (2019), "Grain morphology and strength dilatancy of sands", Géotechnique Letters, 9 (4), 245-253
- G Guida, F Casini, GMB Viggiani (2021), "A fractal analysis method to characterise rock joint morphology", IOP Conference Series: Earth and Environmental Science, 833 (1), 012067
- D Sebastiani, S Miliziano, G Guida, F Casini (2021), "Preliminary evidences on the influence of grains micro-structural features on the TBM tools wear", Geotechnical Aspects of Underground Construction in Soft Ground, 401-407
Fractal analysis stems from the observation that the measured length, $p$, of the contour of many natural irregular closed shapes depends on the scale of measurement, $b$, following a power law (Mandelbrot, 1967):

$p(b) = p_0 \, b^{1-D_f}$ [1.1]

where $p_0$ is a constant and $D_f \geq 1$ is the fractal dimension of the contour. Equation [1.1] implies that the length of the contour of any truly fractal closed shape diverges to infinity as the measurement scale tends to zero. When dealing with physical objects, indefinite subdivision of space does not make sense, as the minimum measurement scale would be limited at least by distances at the atomic level, while in practice, well before this is reached, it is limited by the experimental resolution with which the contour of the particle is defined. However, the plot of the length of the contour of a particle, $p$, versus the measurement scale, $b$, is linear in a double logarithmic space:

$\log p = (1 - D_f) \log b + \log p_0$ [1.2]

so that the fractal dimension can be obtained from the slope of the plot.
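As a minimal numerical check of the power law above (a Python sketch for illustration only, not part of the Matlab code; all names are illustrative), synthetic perimeter data obeying Eq. [1.1] are generated and the fractal dimension is recovered from the slope of the log-log plot:

```python
import math

def loglog_slope(xs, ys):
    """Least-squares slope of log(y) versus log(x)."""
    lx = [math.log(v) for v in xs]
    ly = [math.log(v) for v in ys]
    n = len(lx)
    mx, my = sum(lx) / n, sum(ly) / n
    num = sum((a - mx) * (b - my) for a, b in zip(lx, ly))
    den = sum((a - mx) ** 2 for a in lx)
    return num / den

# Synthetic perimeters obeying p(b) = p0 * b**(1 - Df)
Df, p0 = 1.15, 3.5
b = [0.01 * 2 ** k for k in range(8)]        # stick lengths
p = [p0 * bi ** (1.0 - Df) for bi in b]      # measured contour lengths

slope = loglog_slope(b, p)                   # slope of the log-log plot: 1 - Df
Df_recovered = 1.0 - slope
```

The slope is negative (the measured length grows as the stick shortens), and inverting it returns the imposed fractal dimension.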
The input for the fractal analysis is a 2D image of the particle, of any resolution, as obtained, for instance, by optical or scanning electron microscopy (SEM).
In order to obtain quantitative information from images, they were processed by contrast enhancement, binarization and segmentation. Contrast enhancement increases image sharpness, thus facilitating the subsequent binarization and segmentation. The method by Otsu (1979) was used to obtain the binary version of the original greyscale image, converting each pixel to either white (foreground) or black (background) based on a threshold value. When the particles were not in contact in the binarized image, they were identified simply by labelling areas composed of connected foreground pixels. In more complex situations, for instance when processing images containing grains in contact, a watershed algorithm (Beucher, 1992) was used for segmentation purposes.
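The Otsu thresholding step can be sketched in a few lines of pure Python (an illustrative re-implementation, not the Matlab code used here): the threshold is chosen by maximising the between-class variance of the grey-level histogram.

```python
def otsu_threshold(gray, levels=256):
    """Otsu (1979): pick the threshold maximising between-class variance."""
    hist = [0] * levels
    for row in gray:
        for v in row:
            hist[v] += 1
    total = sum(hist)
    sum_all = sum(i * h for i, h in enumerate(hist))
    best_t, best_var = 0, -1.0
    w_b = 0      # background weight (pixels <= t)
    sum_b = 0.0  # background intensity sum
    for t in range(levels):
        w_b += hist[t]
        if w_b == 0:
            continue
        w_f = total - w_b
        if w_f == 0:
            break
        sum_b += t * hist[t]
        m_b = sum_b / w_b                    # background mean
        m_f = (sum_all - sum_b) / w_f        # foreground mean
        var_between = w_b * w_f * (m_b - m_f) ** 2
        if var_between > best_var:
            best_var, best_t = var_between, t
    return best_t

def binarize(gray, t):
    """Foreground (white, 1) where intensity exceeds the threshold."""
    return [[1 if v > t else 0 for v in row] for row in gray]

# Tiny bimodal "image": dark background (~20), bright grain (~200)
img = [[20, 25, 22, 21],
       [24, 200, 210, 23],
       [22, 205, 198, 25],
       [21, 24, 23, 20]]
t = otsu_threshold(img)
bw = binarize(img, t)
```

With a well-separated bimodal histogram, the threshold falls between the two modes and the four bright pixels become the foreground.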
Figure 1.1a shows schematically a binarized particle image after segmentation, in which white pixels correspond to the particle and black pixels to the background. The contour of the particle can be extracted by subtracting from the binarized image of the particle its 8-connected erosion, to obtain the external line of 8-connected pixels (Fig. 1.1b) in the form of a vector containing their $(x, y)$ coordinates.
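The erosion-and-subtraction idea can be sketched in Python (illustrative only; the actual extraction is performed by the Matlab boundary function described later): eroding the binary image with a 3x3 structuring element and keeping the pixels that disappear yields the contour.

```python
def erode(bw):
    """Binary erosion with a 3x3 (8-connected) structuring element."""
    rows, cols = len(bw), len(bw[0])
    out = [[0] * cols for _ in range(rows)]
    for i in range(rows):
        for j in range(cols):
            if bw[i][j] == 0:
                continue
            keep = True
            for di in (-1, 0, 1):
                for dj in (-1, 0, 1):
                    ni, nj = i + di, j + dj
                    if not (0 <= ni < rows and 0 <= nj < cols) or bw[ni][nj] == 0:
                        keep = False
            out[i][j] = 1 if keep else 0
    return out

def contour_pixels(bw):
    """Contour = binarized image minus its erosion; returns (x, y) pairs."""
    er = erode(bw)
    return [(j, i)
            for i, row in enumerate(bw)
            for j, v in enumerate(row)
            if v == 1 and er[i][j] == 0]

# 3x3 solid block inside a 5x5 image: only the centre survives erosion,
# so the contour is the ring of 8 outer foreground pixels
bw = [[0, 0, 0, 0, 0],
      [0, 1, 1, 1, 0],
      [0, 1, 1, 1, 0],
      [0, 1, 1, 1, 0],
      [0, 0, 0, 0, 0]]
pts = contour_pixels(bw)
```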
Figure 1.2 shows an example of the results of the fractal analysis applied to a natural grain of Toyoura
sand. A SEM photograph of the sand grain at a magnification factor of 300x (Fig. 1.2a) was taken from an online database of SEM images (http://web.utk.edu/~alshibli/research/MGM/archives.php).
Figure 1.2b shows the same grain after binarization and segmentation; in this case, careful contrast
enhancement was required to avoid altering the contour due to the presence of shadows and overlapping.
The characteristic dimension of the particle is taken as its equivalent diameter, $D$, i.e. the diameter of the circle with the same area as the particle.
Figure 1.2d reports the results of the analysis in terms of the normalized contour length, $p/D$, versus the normalized stick length, $b/D$.
The results in Figure 1.2 are typical of natural sand particles. They confirm the findings by Orford
and Whalley (1983) on the emergence of two distinct self-similar patterns describing structural and
textural features. Moreover, the characteristic scale separating the two, which may be regarded as
the maximum size of the micro-asperities, emerges directly from the results and corresponds to the
stick length at the point of intersection of the two linear portions of the plot.
The absolute values of the slopes of the two linear portions of the plot relate to the fractal dimension
in Eq. 1.1, and increase with the complexity of the profile (Vallejo, 1996). These can be obtained by
automatic linear regression in the double logarithmic plane.
Figure 1.3 illustrates this procedure for the contour of the Toyoura sand particle of Figure 1.2.
In this case, the first linear regression, starting from the smallest stick lengths, ends at the value of $b/D$ at which the textural trend gives way to the structural one.
$D_f = 1 + |s|$ [1.3]

where $s$ is the slope of the corresponding linear portion of the double logarithmic plot. For example, in Figure 1.2, the slopes of the two linear portions give the fractal dimensions of the structural and of the textural features of the grain.
The Matlab code 'main.m' takes as input a two-dimensional binarized image of a grain from the folder 'grains' and gives back the fractal analysis results. Binarized means black-and-white (bw), where white pixels represent the object and black pixels the background. If the user does not have a bw image of the grain, the Matlab code 'elab_I.m' processes different types of images in order to obtain the required binarized input data. The image types accepted by Matlab are listed at the following link: https://it.mathworks.com/help/matlab/import_export/supported-file-formats.html.
Figure 2.1 describes the steps of the elab_I.m code, explained below:
- ORIGINAL IMAGE - the input image is loaded from the folder named Grains;
- CROPPED GRAYSCALE GRAIN - the grain of interest is manually selected with the command imcrop and converted into a grayscale image [NOTE: it is easier to work on grayscale images because they are two-dimensional matrices, while colour images are three-dimensional];
- INVERTED - if the original image has a background brighter than the object, it is necessary to invert the image, I=1-I; otherwise, comment the line out with "%": %I=1-I;
- NOISE REDUCTION - a Gaussian filter smooths the pixel values in order to attenuate noise;
- CONTRAST ENHANCEMENT - histogram stretching makes the highest-value pixels brighter and the lowest-value pixels darker;
- BINARIZED - the Otsu threshold is imposed.
For each image it is necessary to make sure that the contour of the binarized footprint matches the contour of the grain under examination.
There are other properties that have to be set:
- the entire name of the file, "filename", containing the input image, inside the folder Grains;
- the number of sets of stick lengths, "n_b";
- the tolerance, "tol", on the linear regressions.
The function "[boundary,cc]=boundary(bw)" takes as input the binarized image bw and gives back:
- "boundary", a matrix formed by two columns, the x and y coordinates, with as many rows as the number of pixels of the grain contour;
- the image "cc", containing all the connected pixels of the input image.
Figure 3.1 reports an example of boundary extraction.
From the connected-pixels image it is possible to extract some geometric and shape properties, using the function "[geom,shape] = descriptor(cc)", which gives as output two vectors:
- "geom", containing geometrical information such as the equivalent diameter, $D$ (the diameter of the circle with the same area as the grain), the perimeter, $P$, and the area, $A$, of the particle;
- "shape", containing morphological information such as the circularity, $C = 4\pi A/P^2$, which measures how similar the object is to a circle; the aspect ratio, $AR = D_{Feret,min}/D_{Feret,max}$, where $D_{Feret,min/max}$ is the minimum/maximum distance between two parallel tangents to the particle, which measures the elongation; and the convexity, $CX = A/A_{conv}$, where $A_{conv}$ is the area of the convex polygon tangent to the particle, which measures the concavities of the object. All these shape descriptors take values between 0 and 1, with 1 attained by a perfect circle. Geometric and shape descriptors are printed in the command window while the code runs.
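Two of these descriptors can be sketched in Python (an illustrative re-implementation of the formulas above, not the Matlab descriptor function): the circularity from area and perimeter, and the aspect ratio from Feret diameters estimated by scanning projection directions. Convexity would additionally require a convex-hull area and is omitted here.

```python
import math

def circularity(area, perimeter):
    """C = 4*pi*A / P**2: equals 1 for a circle, less for irregular shapes."""
    return 4.0 * math.pi * area / perimeter ** 2

def feret_aspect_ratio(points, n_dirs=180):
    """AR = min/max Feret diameter, estimated over n_dirs projection directions."""
    widths = []
    for k in range(n_dirs):
        a = math.pi * k / n_dirs
        c, s = math.cos(a), math.sin(a)
        proj = [x * c + y * s for x, y in points]
        widths.append(max(proj) - min(proj))
    return min(widths) / max(widths)

# Unit square: A = 1, P = 4  ->  C = pi/4, clearly below the circle value 1
C = circularity(1.0, 4.0)

# Corners of a 2 x 1 rectangle: min Feret width 1, max = the diagonal sqrt(5)
rect = [(0.0, 0.0), (2.0, 0.0), (2.0, 1.0), (0.0, 1.0)]
AR = feret_aspect_ratio(rect)
```

The square gives $C = \pi/4 \approx 0.785$, and the elongated rectangle an aspect ratio of about $1/\sqrt{5} \approx 0.447$.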
The function "[b_D,p_D]=fractal_analysis(geom,boundary,n_b)" computes the contour length of the particle using sets of sticks as units. It takes as input the geometrical information of the particle, "geom", the boundary coordinates, "boundary", and the number of stick sizes, "n_b". The outputs are two vectors, "b_D" and "p_D", one containing the stick lengths and the other the corresponding perimeter values, both normalized by the equivalent diameter, $D$. The computation is organised in three nested loops:
- the first loop runs over the starting points, so all the operations described in that loop are repeated for each starting point;
- the second loop runs over the stick lengths, each time taking the next smaller one;
- the third loop runs along the particle boundary using the stick as a unit.
This is the most time-consuming part of the code, which nevertheless takes a few minutes to run on a standard computer with the default inputs. A percentage counter in the command window updates the status of the calculation. If it stalls for too long, your processor may not be fast enough; in that case, try decreasing the number of sticks, "n_b", or the lower bound of the vector bacc (see Section 4.2).
The first loop runs the analysis over different starting points, "nn", corresponding to each point of the particle boundary. Every time the loop restarts, the matrix of boundary coordinates is sorted into the temporary "bound" matrix, placing at the first index the coordinates of the current starting point.
A first check consists to control that the first and biggest stick, equal to
The second loop, inside the first, runs over the stick lengths taken from the vector "bacc". Every time a new stick is used, the starting-point coordinates are imposed anew. The starting coordinates "x0" and "y0" are taken from the sorted boundary coordinates "bound" defined in the first loop.
The third loop detects the intersection points between the stick and the particle boundary. In practice, one end of the stick is fixed at the starting point and the intersection between the other end of the stick and the boundary is detected, following the pixel order. The new intersection becomes the starting point for the next detection. This is coded as a "while" loop that increases the counter "j", initially equal to the index of the starting point, until the particle contour ends ("j==n"). The loop detects an intersection point through the condition "dist>=b", i.e. when the distance "dist" between the starting point and the j-th point of the boundary becomes longer than the stick length "b". In this case a new intersection point is found, and the corresponding "j" becomes the index of the starting point for the next detection. The intersection coordinates are stored in the matrix "p(m x nn)" for each stick length "m" and each starting point "nn". The perimeter is computed by summing the distances between consecutive intersection points, normalized by the equivalent diameter, and stored in a matrix with as many rows as the number of stick lengths and as many columns as the number of contour points. The output vector of perimeters, "p_D", takes the minimum value of the contour length over all the starting points adopted, for each associated stick length in "b_D".
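The walk performed by the third loop can be sketched in Python (an illustrative sketch of the "dist>=b" criterion for a single starting point; the minimisation over all starting points done by the Matlab code is omitted):

```python
import math

def stick_perimeter(boundary, b):
    """Divider (stick) method on an ordered closed contour: fix one end of
    the stick at the current point, advance along the boundary until the
    straight-line distance first reaches b, and sum the chord lengths."""
    n = len(boundary)
    total = 0.0
    x0, y0 = boundary[0]
    for j in range(1, n):                      # the "while j" loop of the code
        x, y = boundary[j]
        dist = math.hypot(x - x0, y - y0)
        if dist >= b:                          # intersection found
            total += dist
            x0, y0 = x, y                      # new starting point
    # close the contour with the leftover chord back to the first point
    total += math.hypot(boundary[0][0] - x0, boundary[0][1] - y0)
    return total

# Unit circle sampled every degree: small sticks recover ~2*pi, while a
# stick of length 1 gives the perimeter of the inscribed hexagon (~6)
circle = [(math.cos(math.radians(t)), math.sin(math.radians(t)))
          for t in range(360)]
p_small = stick_perimeter(circle, 0.05)
p_large = stick_perimeter(circle, 1.0)
```

For a smooth (non-fractal) contour such as a circle, the measured length converges as the stick shortens; for a fractal contour it keeps growing, as in Eq. [1.1].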
The fractal descriptors are evaluated by the function:
"[mhu,m,M,ind_mhu,ind_m,ind_M]=MyMorphology(p_D,b_D)", which takes as input the results of the fractal analysis, "p_D" and "b_D", containing the minimum perimeter values and the associated stick lengths respectively, and gives back as output the fractal morphological descriptors, $\mu$, $m$ and $M$.
The fractal analysis results are reported in a double logarithmic space (Figure 5.1), with "b_D" on the horizontal axis and "p_D" on the vertical axis.
When the stick length is smaller than the pixel size, the log-log plot could present a spurious plateau, since the measured perimeter cannot change below the image resolution.
The regressions are performed by the function "[ind,m_fr,Q]=regression(X,Y,toll)", which takes as inputs "X" and "Y", equal to the logarithms of "b_D" and "p_D" respectively, and the tolerance "tol", imposed at the beginning as input data, and gives back the regression data: the index of the regression end ("ind"), its slope ("m_fr") and its intercept ("Q"). Starting from the smallest stick length, the regression function performs a while loop in order to detect the two linear trends automatically: it looks for the maximum number of subsequent points such that the correlation coefficient of the fit stays within the imposed tolerance.
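This growth criterion can be sketched in Python (an illustrative re-implementation; the assumption that the loop stops as soon as the correlation coefficient falls below 1 - tol is inferred from the text, and all names are illustrative):

```python
import math

def linreg(xs, ys):
    """Least-squares fit y = m*x + q, returning (m, q, r)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    sxy = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sxx = sum((x - mx) ** 2 for x in xs)
    syy = sum((y - my) ** 2 for y in ys)
    m = sxy / sxx
    q = my - m * mx
    r = sxy / math.sqrt(sxx * syy) if syy > 0 else 1.0
    return m, q, r

def auto_regression(X, Y, tol):
    """Grow the fit from the first points; stop as soon as the correlation
    coefficient drops below 1 - tol. Returns the number of points kept,
    the slope and the intercept of the last accepted fit."""
    n_pts = 2
    m, q, _ = linreg(X[:2], Y[:2])
    for k in range(3, len(X) + 1):
        mk, qk, r = linreg(X[:k], Y[:k])
        if abs(r) < 1.0 - tol:
            break
        n_pts, m, q = k, mk, qk
    return n_pts, m, q

# Bilinear log-log data: slope -0.3 up to x = 1.0, then slope -0.05
X = [0.1 * k for k in range(1, 21)]
Y = [-0.3 * x if x <= 1.0 else -0.3 - 0.05 * (x - 1.0) for x in X]
n_fit, m_fit, q_fit = auto_regression(X, Y, tol=1e-4)
```

The first regression stops exactly at the breakpoint between the two trends; the second regression would then be run on the remaining points in the same way.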
Data are saved in a file "results.xls", in a sheet with the same name as the grain image. In this way, all the analyses can be stored in the same file, automatically on different sheets. The saved data are the following:
- the pixel coordinates, $x$ [px] and $y$ [px], of the grain contour;
- the stick lengths, $b/D$, and the relative perimeter values, $p/D$, results of the fractal analysis;
- the geometric properties of the grain: equivalent diameter [px], perimeter [px], and area [px$^2$];
- the shape properties computed by Matlab: circularity, $C$, aspect ratio, $AR$, and convexity, $CX$;
- the fractal morphological descriptors: $M$, $m$ and $\mu$;
- the initial, ($b/D|_{min}$; $p/D|_{min}$), intermediate, ($b/D|_m$; $p/D|_m$), and final, ($b/D|_M$; $p/D|_M$), coordinates of the regressions.
Beucher, Serge et al. (1992). "The watershed transformation applied to image segmentation". In: Scanning Microscopy Supplement, pp. 299-299.
Mandelbrot, Benoit (1967). "How long is the coast of Britain? Statistical self-similarity and fractional dimension". In: Science 156.3775, pp. 636-638.
Orford, J.D. and W.B. Whalley (1983). "The use of the fractal dimension to quantify the morphology of irregular-shaped particles". In: Sedimentology 30.5, pp. 655-668.
Otsu, Nobuyuki (1979). "A threshold selection method from gray-level histograms". In: IEEE Transactions on Systems, Man, and Cybernetics 9.1, pp. 62-66.
Stachowiak, G.W. (1998). "Numerical characterization of wear particles morphology and angularity of particles and surfaces". In: Tribology International 31.1-3, pp. 139-157.
Vallejo, L.E. (1996). "Fractal analysis of granular materials". In: Géotechnique 45.1, pp. 159-163.
Von Koch, Helge (1906). "Une méthode géométrique élémentaire pour l'étude de certaines questions de la théorie des courbes planes". In: Acta Mathematica 30.1, pp. 145-174.