SSM Using Scalismo (5): Creating a point distribution model

Finally, I arrive at statistical shape models. I am going to create a point distribution model (PDM) of shapes. According to the theory of point distribution models, I need superimposed shapes and their mean shape; both are available from my previous tutorial on GPA with Scalismo. The outcome of the full GPA is going to be utilized for creating a PDM. The PDM in this tutorial uses shapes (not forms), and it is based on the multivariate normal distribution of the points’ coordinates collected in a shape vector (Eq. 16 in this post).

This post contains code blocks; if your browser does not automatically display the code blocks, please click on the displayed dark narrow rectangular areas to display the containing code.

I start with initializing Scalismo and loading the superimposed configurations, i.e. shapes. Following the previous tutorial, I use the femoral head shapes.

object PDM extends App {

  import scalismo.geometry._
  import scalismo.common._
  import scalismo.ui.api._
  import scalismo.mesh._
  import scalismo.io.{StatisticalModelIO, MeshIO}
  import scalismo.statisticalmodel._
  import scalismo.registration._
  import scalismo.statisticalmodel.dataset._
  import scala.io.StdIn.{readLine, readInt}

  scalismo.initialize()
  implicit val rng = scalismo.utils.Random(42)
  // ui
  val ui = ScalismoUI()

  // Loading data
  val drive = "I:/"
  val meshDir = drive + "registration/Full GPA/"
  val unsortedMeshFiles = new java.io.File(meshDir).listFiles()
  unsortedMeshFiles.foreach(file => println(s"\n$file"))
  // specifying the file name of the mean mesh generated through the full GPA
  val meanMeshName = "meanMesh.stl"
  val meshFileNames = unsortedMeshFiles.map(file => file.getName())
  val indexOfMeanMesh = meshFileNames.indexOf(meanMeshName)
  // bringing the meanMesh to the head (first entry) of the meshFiles sequence by swapping it with the current first entry
  val meshFiles = unsortedMeshFiles.updated(0, unsortedMeshFiles(indexOfMeanMesh)).updated(indexOfMeanMesh, unsortedMeshFiles(0))

I have to choose a reference mesh and some training meshes. The full GPA mean mesh is chosen as the reference mesh, and the three (registered and superimposed) meshes as the training meshes.

// specifying the reference mesh and the training meshes; the full GPA mean mesh is set to be the reference mesh
  val refMesh = MeshIO.readMesh(meshFiles.head).get
  val trainingMeshes = meshFiles.tail.map(file => MeshIO.readMesh(file).get)//the other 3 meshes

Once a reference mesh and training meshes are chosen, the deformation vector from each point of the reference mesh to its corresponding point of each training mesh should be calculated. Thereby, the deformation fields morphing the reference mesh to the training meshes are obtained.

 // [The meshes are already aligned using the full GPA, and in correspondence, so we can create a discrete GP]
  // creating the deformation vector field
  val deformationFields = trainingMeshes.map(mesh => {
    val deformationVectors = refMesh.pointSet.pointIds.map { id =>
      mesh.pointSet.point(id) - refMesh.pointSet.point(id)
    }.toIndexedSeq
    // a discrete deformation field needs a domain and the deformation vectors at each point of the domain
    DiscreteField[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]](refMesh.pointSet, deformationVectors)
  })

The calculated deformation fields are samples (realizations) of a discrete random field. The domain of the random field is the reference mesh points. The random field of deformations is assumed to have a multivariate normal distribution. Here, the discrete random field is expressed by a random vector. Thereby, a shape, described by a shape vector, can be produced by adding the shape vector of the reference shape to the normally distributed random vector of deformations expanded by the deformation modes (this is explained by Eq. 16 in this post).

Technically, Scalismo works with the concept of a Gaussian process (GP), which is a continuous random field in which any finite sub-collection of random variables (here, deformations) has a multivariate normal (Gaussian) distribution. Therefore, continuous sample deformation fields should be approximated based on the known discrete realizations/samples of the random field. By interpolating a discrete field, Scalismo creates a continuous field. Different interpolation approaches are possible; here, the nearest-neighbor interpolation method is used. The following code block defines a discrete GP (of the deformations), which is in fact a normally distributed random vector with elements indexed by the reference mesh points (a discrete domain). The discrete GP (random vector of deformations) is expressed in terms of its principal axes (modes). This is done using PCA as described in this post (see Eq. 16).

  val interpolator = NearestNeighborInterpolator[_3D, EuclideanVector[_3D]]()
  val continuousDefFields = deformationFields.map(field => field.interpolate(interpolator))
  val gp = DiscreteLowRankGaussianProcess.createUsingPCA(refMesh.pointSet, continuousDefFields)

As I have 3 training meshes and therefore 3 training deformation fields, the rank of the sample covariance matrix of deformations is at most 3. This means that at most 3 principal axes are used to express the random vector of deformations.

I can sample the GP (random field) of deformations. By adding a sample deformation field to the reference mesh, I can generate a shape vector. This shape vector is a sample mesh from the distribution of meshes whose points’ coordinates are normally distributed. Sampling some meshes from the mesh distribution, I’ll have realizations like the following:

  val PDMGroup = ui.createGroup("refMeshGroup")
  val refMeshView = ui.show(PDMGroup, refMesh, "refMesh")
  refMeshView.color = java.awt.Color.GREEN
  val gpDefView = ui.addTransformation(PDMGroup, gp, "RefMeshDefrmdByGp")
Fig. 1. some samples from the statistical point distribution model. One of the coefficients of the shape modes attains a relatively large value (close to its three standard deviation).

The horizontal bar-like controls in the figures are proportional to the coefficients of the deformation modes (b_i=\alpha_i\sqrt{\lambda_i}). Each coefficient varies within three standard deviations (Eq. 4 in this post).
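The instance equation behind these controls, x = \bar{x}+\sum_i \alpha_i\sqrt{\lambda_i}\phi_i, can be sketched in plain Scala (not the Scalismo API); the eigenvalues, modes, and coefficients below are made-up illustrations, not real model output:

```scala
// Toy sketch of building one shape instance from b_i = alpha_i * sqrt(lambda_i)
object PdmInstanceSketch extends App {
  val refShape = Vector(0.0, 0.0, 0.0, 0.0)      // reference shape vector (4 coords)
  val eigenvalues = Vector(4.0, 1.0)             // lambda_1, lambda_2
  val modes = Vector(                            // phi_1, phi_2 (orthonormal)
    Vector(1.0, 0.0, 0.0, 0.0),
    Vector(0.0, 1.0, 0.0, 0.0)
  )
  val alphas = Vector(0.5, -1.0)                 // standard-normal coefficients, usually kept within +/- 3

  // b_i = alpha_i * sqrt(lambda_i)
  val bs = alphas.zip(eigenvalues).map { case (a, l) => a * math.sqrt(l) }

  // instance = refShape + sum_i b_i * phi_i, coordinate by coordinate
  val instance = refShape.indices.map { k =>
    refShape(k) + bs.zip(modes).map { case (b, phi) => b * phi(k) }.sum
  }
  println(instance) // Vector(1.0, -1.0, 0.0, 0.0)
}
```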

Scalismo assembles the reference mesh and the discrete GP as a statistical mesh model:

  // creating a statistical shape model (PDM)
  val model = StatisticalMeshModel(refMesh, gp.interpolate(interpolator))

This is the statistical PDM.
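For completeness, the model can be sampled and written to disk so the PDM does not have to be rebuilt every time. This is a sketch based on Scalismo’s StatisticalModelIO (the file name is a made-up example):

```scala
  // drawing a random mesh from the PDM (uses the implicit rng defined above)
  val sampledMesh = model.sample()
  // writing the PDM to an HDF5 file; the path below is hypothetical
  val modelFile = new java.io.File(meshDir + "pdm.h5")
  StatisticalModelIO.writeStatisticalMeshModel(model, modelFile)
```

StatisticalModelIO is already imported at the top of the object.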

SSM Using Scalismo (3): Non-rigid registration

Non-rigid registration

The rigid registration preserves the shape and size (form) of objects by merely transforming them through rotation and translation. Unlike rigid registration, non-rigid registration (NRR) deforms (morphs) an object/configuration to fit it onto another object. Here, an object is a valid surface whose geometry can be discretely represented by a triangular mesh. A configuration is the set (or a subset) of points/nodes of the mesh of the object. In short, the geometry of an object is closely approximated by a finite set of points and triangles connecting the points/nodes; all together, a triangular mesh. The triangles linearly interpolate the actual geometry between the points/nodes. I am going to refer to a point of a triangular mesh as a node.
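To make the terminology concrete, a triangular mesh can be sketched as a minimal data structure in plain Scala (an illustration only, not Scalismo’s TriangleMesh):

```scala
// A minimal triangular-mesh sketch: nodes in 3D plus triangles indexing them.
object MeshSketch extends App {
  case class Node(x: Double, y: Double, z: Double)
  case class Triangle(a: Int, b: Int, c: Int) // indices into the node list

  // a unit square in the z = 0 plane, discretized by two triangles
  val nodes = IndexedSeq(
    Node(0, 0, 0), Node(1, 0, 0), Node(1, 1, 0), Node(0, 1, 0)
  )
  val triangles = IndexedSeq(Triangle(0, 1, 2), Triangle(0, 2, 3))

  // the configuration is the set of nodes; the triangles interpolate between them
  println(nodes.size)     // 4
  println(triangles.size) // 2
}
```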

But why do we want to morph an object onto another object? One application is to establish one-to-one correspondence between the configurations’ points. For statistical shape modelling/analysis, correspondence is essential. Another difference between rigid registration and NRR is that rigid registration entails known landmark correspondence, whereas NRR seeks correspondence. Briefly, the registration problem (NRR) is the problem of establishing correspondence between two objects (between their points).

Imagine two objects having configurations/meshes X and Y with different numbers of points; one with m and the other with n points. A 2D example is depicted below. Obviously, a bijective correspondence cannot be established between the points of X and Y, so we cannot think of a point-to-point solution. The proper way of establishing correspondence between the geometries of two objects is to morph the mesh of one of them onto the other. In the figure below, the configuration/mesh X is morphed onto the configuration/mesh Y. Thereby, X’ closely matches Y, and its points are in correspondence with those of X. We have to pay attention to the difference between these three terms and their individual roles in the problem: the reference mesh, the target mesh, and the fitted mesh; here, X, Y, and X’ respectively.

Terms: By deformation of an object we mean the deformation vectors or field. By transformation of an object we mean the transformed geometry.

Here, two questions bug my brain (there must be some brain in there..!): 1- how to morph? 2- how much to morph? Devising and utilizing a model simulating plausible deformations answers the first question, and defining a distance function measuring the distance between two meshes answers the second one.

Fig. 1. Two configurations with different number of points; one morphs to the other one.

A reference mesh \Gamma_R (also acting as the domain of the functions), a target mesh \Gamma_T, a model of deformation fields u, a distance function D, and a weighted regularization term on the deformations formulate the registration problem as the following optimization problem [1]:

\hat u=\argmin_u (D[\Gamma_R,\Gamma_T,u]+\eta R[u])\tag{1}

The deformation field is a function u:\mathbb{R}^3\to\mathbb{R}^3. The distance function eats two (constant) configurations, the reference and the target ones, and also the deformation field; then it computes the distance between the target mesh and the deformed reference mesh, i.e. \Gamma = \Gamma_R+u(\Gamma_R). The regularization term R involves the prior assumption. It penalises deformations that do not fit the prior assumption [1]. This method is called parametric NRR.

The theory of parametric NRR is the content of another post. Here, I try to explain the practical steps toward putting the theory into practice with Scalismo.

1- Initialising Scalismo

In addition to the main code, I defined some functions and organised them in a package object. The package code is at the end of the post. To learn how to create a package, click here.


object NRR extends App {
  //To send the code selection to Scala Console, select the code in the Editor, press Ctrl + Shift + X
  //imports
  import scalismo.geometry._
  import scalismo.common._
  import scalismo.ui.api._
  import scalismo.mesh._
  import scalismo.registration.LandmarkRegistration
  import scalismo.io.{MeshIO, LandmarkIO}
  import scalismo.numerics.UniformMeshSampler3D
  import breeze.linalg.{DenseMatrix, DenseVector, csvread}
  import scalismo.registration.{Transformation, RotationTransform, TranslationTransform, RigidTransformation}
  import scala.io.StdIn.{readLine, readInt}
  import scalismo.numerics._
  import scalismo.kernels._
  import scalismo.statisticalmodel._
  import scalismo.registration._
  scalismo.initialize()
  implicit val rng = scalismo.utils.Random(42)
  // ui
  val ui = ScalismoUI()

This is to include the package:

import custom.methods._

2- Loading meshes and landmarks

We use our rigidly aligned meshes of the femoral heads and their anatomical landmarks; available in the post SSM Using Scalismo (1): Rigid alignment. The initial rigid alignments of the meshes and some landmarks in correspondence are going to mitigate the registration problem. This time, we intend to fit/morph (NRR) the reference mesh to the target meshes.

 // [Loading aligned meshes and LM files]
  val meshDir = "I:/registeration/AlignedData/Meshes/"
  val LMDir = "I:/registeration/AlignedData/LMs/"
  val registeredMeshDir = "I:/registeration/Registered/"
  // meshes
  val meshFiles = new java.io.File(meshDir).listFiles()
  meshFiles.foreach(file => println(s"${file}\n"))

  val numOfMeshes = meshFiles.size
  println(s"number of files is ${numOfMeshes}")

  // locating the refMesh file, i.e. mesh0.stl. The refMesh has to be named mesh0.stl
  val refMeshIndex : Int = (0 until numOfMeshes).map(i => meshFiles(i).getName).indexOf("mesh0.stl")
  if (refMeshIndex != 0) {println("Rename refMesh File"); sys.exit() }
  val refMesh = MeshIO.readMesh(meshFiles(refMeshIndex)).get

  val targetMeshes: Array[TriangleMesh[_3D]] = meshFiles.tail.map(meshFile => MeshIO.readMesh(meshFile).get) //meshes to get fitted to refMesh
  // Loading landmarks.
  val LMFiles = new java.io.File(LMDir).listFiles()
  // Landmarks of the reference mesh should be in LMs(0)
  val LMs = LMFiles.map(f => LandmarkIO.readLandmarksJson[_3D](f).get)

A reference mesh view if you want:

 val refMeshGroup = ui.createGroup("refMeshGroup")
 val refMeshView = ui.show(refMeshGroup, refMesh, "refMesh")
 refMeshView.color = java.awt.Color.GREEN
Fig. 1. The reference mesh (dimensions are added manually)

3- Defining a Gaussian process

The deformation field is going to be modelled by a Gaussian process (GP). The GP outputs plausible/permissible deformation components at each point of the reference mesh. To define a GP, we need a mean deformation field and a kernel (this post). The GP should produce deformation fields capturing the deformations needed to morph the reference mesh onto each of the target meshes. Looking at the reference mesh, we see a combination of global and local geometrical features: a dome-like shape with an indentation. So, it is better to define a kernel taking different scales of deformations into account. The mean field is set to be a zero field.

  val mean = VectorField(RealSpace[_3D], (_ : Point[_3D]) => EuclideanVector.zeros[_3D])
  val kernel1 = DiagonalKernel[_3D](GaussianKernel(sigma = 3) * 0.1, outputDim = 3)
  val kernel2 = DiagonalKernel[_3D](GaussianKernel(sigma = 7) * 2.0, outputDim = 3)
  val kernel = kernel1 + kernel2
  val GP = GaussianProcess(mean, kernel)
  val lowRankGP = LowRankGaussianProcess.approximateGPCholesky(refMesh.pointSet, GP, 0.05, NearestNeighborInterpolator())

We can view samples of deformations of the reference mesh under the GP to check whether the defined GP kernel is capable of capturing the needed deformations or not.

 val gpSampleGroup =  ui.createGroup("gpSampleGroup")
 val refMeshView2 = ui.show(gpSampleGroup, refMesh, "refMesh")
 // Transforming whatever mesh in the gpSampleGroup by the GP-sampled deformation fields
  val gpDefView = ui.addTransformation(gpSampleGroup, lowRankGP, "RefMeshDefrmdByGp")
  removeGroupView(gpSampleGroup) // see the definition in the package
Fig.2. GP-Sampled deformations on the reference mesh.

The deformations seem OK; nothing weird, like intersecting triangles, occurred. Let’s hope that the kernel is going to work, and continue.

4- Defining regression data for posterior GPs

Now it is the moment to use the known landmark correspondence information. On each mesh, there are 3 (inter-mesh) corresponding landmarks. Therefore, there are known deformation vectors from the landmarks on the reference mesh to their corresponding landmarks on each of the target meshes. We want to incorporate these known deformations into our GP deformation model. This can be done through GP regression, and the resultant model, with the incorporation of the known deformations, is called the posterior GP model. The theory is touched on in this post.

The landmarks have IDs and coordinates. The coordinates are not necessarily associated with the nodes. Therefore, we first have to find the node closest to each landmark of the reference mesh. Here is how:

//[Defining posterior lowrankGp given the landmark transformations from reference to a target mesh]
  // finding points (nodes) corresponding/associated to/with landmarks of the refMesh
  // The nodes on the mesh closest to the landmarks are the nodes associated with the landmarks
  val refLMs = LMs(0)
  val refLMNodeIds = refLMs.map(eachLM => refMesh.pointSet.findClosestPoint(eachLM.point).id)

Generating the regression data, i.e. known deformation vectors (from the reference mesh to each target mesh).

 // generating regression data i.e known deformations from the LMs' corresponding points on the RefMesh to corresponding landmark points of the target mesh
  //landmarks are already sorted in the order of correspondence
  val knownDeformations =  (0 until numOfMeshes-1).map(i => {
    (0 until LMs(0).size).map(j => (LMs(i+1)(j).point - refMesh.pointSet.point(refLMNodeIds(j))))
  })
  // obtaining the coordinates of the refMesh points associated with the refMesh landmarks
  val refLMpoints = refLMNodeIds.toIndexedSeq.map(eachId =>refMesh.pointSet.point(eachId))

The known deformation fields (regarding each target mesh) at the reference nodes (associated with the reference landmarks) can be viewed:

// visualizing known deformation fields
  // generating a domain of points needed for a vector field definition/construction
  val refLMdomain =  UnstructuredPointsDomain(refLMpoints)
  val knownDeformationFields = knownDeformations.map(eachDefSet => {
    DiscreteField[_3D, UnstructuredPointsDomain[_3D], EuclideanVector[_3D]](refLMdomain, eachDefSet)
  })
  val knownDeformationGroup = ui.createGroup("knownDeformations")
  knownDeformationFields.map(eachfield => ui.show(knownDeformationGroup, eachfield, "knownDefs"))
  removeGroupView(knownDeformationGroup)
Fig. 3. Deformation vectors (fields) on the nodes (the landmarks) of the reference mesh. There are two known deformation vectors at each node (associated with a landmark) because there are two target meshes.

Because the known/observed deformations are usually polluted with some sort of uncertainty, a noise term should be added to the known deformations. We add a little bit of noise as:

val noise = MultivariateNormalDistribution(DenseVector.zeros[Double](3),0.001* DenseMatrix.eye[Double](3))

Each set of regression data (known/observed deformation fields plus noise) regarding each target mesh contains a sequence of data in the tuple form (data point, deformation, noise). A data point is a point of the reference mesh at which the known noisy deformations are observed. We define the following function to generate a set of regression data for a target mesh, and then use it to generate the sets of regression data for all the target meshes.

def RegressionDataGenerator(dataPoints:IndexedSeq[Point[_3D]], deformations:IndexedSeq[EuclideanVector[_3D]],noise:MultivariateNormalDistribution)={
    val regressionDataSet = (0 until dataPoints.size).map(i => (dataPoints(i),deformations(i),noise))
    regressionDataSet //.toIndexedSeq
  }
// regression data sets regarding target meshes
  val regressionDataSets = (0 until numOfMeshes - 1).map(i =>  RegressionDataGenerator(refLMpoints,knownDeformations(i),noise))

5- Posterior GP and registration

The following steps are going to be taken for each target mesh, so I use a for loop:

for( n <- (0 until numOfMeshes - 1)) {

5-1- Defining the Posterior GP based on each regression data

 val posteriorGP = lowRankGP.posterior(regressionDataSets(n))

Let’s view some samples of transformations of the reference mesh deformed by the posterior GP, based on the regression data of one of the target meshes:

 val postGpGroup = ui.createGroup("postGpGroup")
 val refMeshView3 = ui.show(postGpGroup, refMesh, "refMesh")
 val PGpView = ui.addTransformation(postGpGroup, posteriorGP, s"TtransformedByPGp $n")
 removeGroupView(postGpGroup)
Fig. 4. Sampled transformation of the reference mesh deformed by the posterior GP.

5-2- Solving the NRR problem

To solve an NRR problem based on Eq. 1, an agent modelling the deformation fields, a distance function, a regularizer, and the weight of the regularizer are needed. Scalismo reduces the NRR problem into the following minimization problem [1]:

\argmin_{\alpha_1,…,\alpha_r}D\Big[\Gamma_R,\Gamma_T,\mu+\sum_{i=1}^{r}\alpha_i\sqrt{\lambda_i}\phi_i\Big]+\eta \sum_{i=1}^{r}\alpha_i^2\tag{2}

where \alpha_i‘s are the registration coefficients minimizing the above expression, \mu is the mean of the GP (here, zero everywhere), \lambda_i‘s and \phi_i‘s are the eigenvalues and eigenfunctions of the low-rank approximation of the GP (see this post), and r is the rank of the low-rank approximation of the GP. These eigenvalues/functions are known once the GP (or the posterior GP) is known. \eta is the weight of the regularization term \sum_{i=1}^{r}\alpha_i^2, which is an L2 regularizer.

The distance function measuring the distance between the transformed reference mesh \Gamma and the target mesh can be defined as the mean squared distance. For a moment, let’s assume that the surfaces are continuous. A point-to-point distance function is defined as:

\text{CP}_{\Gamma_T}(x):=\|x-\argmin_{x’ \in \Gamma_T} \|x-x’\|\|

where x\in\Gamma (a surface), and \|.\| is the Euclidean vector norm. This function calculates the distance between each point on \Gamma and its closest point on \Gamma_T; the arg min operator finds the closest point on \Gamma_T.

The average of the distance function over the continuous domain \Gamma_R represents the average distance between \Gamma and \Gamma_T:

D[\Gamma_R, \Gamma_T,u]:=\frac{1}{S(\Gamma_R)}\int_{\Gamma_R}\big (\text{CP}_{\Gamma_T}(x+u(x))\big )^2 ds

in which S(\Gamma_R) is the area of \Gamma_R. This distance measure goes into Eq. 2 and is evaluated numerically. It is good to know that the numerical evaluation of the above integral for discrete surfaces, i.e. meshes, takes the following form:

D[\Gamma_R, \Gamma_T,u]\approx \frac{1}{n} \sum_{i=1}^{n}\big (\text{CP}_{\Gamma_T}(x_i+u(x_i))\big )^2

such that x_i\in\Gamma and n is the number of nodes/points of \Gamma_R; here, the surfaces are represented by meshes. The integral can also be evaluated at m\lt n sampled points/nodes of \Gamma_R; in this case, we replace n with m.
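The discretized distance above can be sketched in plain Scala on a toy 2D example (the point sets below are made up; Scalismo computes this internally through its metric):

```scala
// Toy sketch of D ~ (1/n) * sum_i CP(x_i + u(x_i))^2 on 2D points.
object ClosestPointDistanceSketch extends App {
  type P = (Double, Double)
  def dist(a: P, b: P): Double = math.hypot(a._1 - b._1, a._2 - b._2)

  val target = Seq((0.0, 0.0), (1.0, 0.0), (2.0, 0.0))  // target "mesh" points
  val deformedRef = Seq((0.0, 1.0), (2.0, 1.0))         // deformed reference points x_i + u(x_i)

  // CP: distance from a point to its closest target point
  def cp(x: P): Double = target.map(t => dist(x, t)).min

  val d = deformedRef.map(x => math.pow(cp(x), 2)).sum / deformedRef.size
  println(d) // 1.0 (each deformed point sits exactly 1.0 above a target point)
}
```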

Minimizing Eq. 2 is done through iteration. To start, we need initial values for the coefficients \alpha_i, the number of sampled points/nodes for the numerical evaluation of the distance measure, an initial value for the regularization weight, and the number of iterations.

The initial values of \alpha_i are set to zero by an \mathbb{R}^{r\times 1} vector:

val initialCoefficients = DenseVector.zeros[Double](posteriorGP.rank)

Now, instead of working with fixed values for the number of sampled points, the regularization weight, and the number of iterations, the iteration procedure is divided into several stages in order to fine-tune the minimization problem. Physically, finer geometrical features (up to the capability of the GP) are considered at each stage. To this end, we define a sequence of registration parameters, namely the regularization weight, the number of iterations, and the number of sampled points/nodes. This is how it looks:

 val registrationParameters = Seq(
      RegistrationParameters(regularizationWeight = 1e-1, numberOfIterations = 20, numberOfSampledPoints = 1000),
      RegistrationParameters(regularizationWeight = 1e-2, numberOfIterations = 30, numberOfSampledPoints = 1000),
      RegistrationParameters(regularizationWeight = 1e-4, numberOfIterations = 40, numberOfSampledPoints = 2000),
      RegistrationParameters(regularizationWeight = 1e-6, numberOfIterations = 50, numberOfSampledPoints = 4000)
    )

where RegistrationParameters is a case class; see its definition in the package file at the end of the post.

We consider Eq. 2 to clarify the above approach. When the regularization weight is relatively large, the minimization problem leads to relatively small coefficients; therefore, modes of deformation associated with relatively small eigenvalues in \sum_{i=1}^{r}\alpha_i\sqrt{\lambda_i}\phi_i are ruled out. This means only the modes having large effects on the deformations remain. These modes tend toward global deformations rather than the local deformations leading to fine geometrical features. Having said that, it is reasonable to start the minimization procedure with fewer points, uniformly sampled from our discrete domain \Gamma_R, and fewer iterations. Thereby, a set of coefficients appropriate for an approximate global fit (of the reference mesh to the target mesh) is calculated. Then, the regularization weight is decreased so that other deformation modes with relatively smaller eigenvalues can effectively appear in the minimization. This means taking care of more local deformations, hence finer geometrical features. As we want to capture smaller deformations, we have to sample more finely, i.e. use more points. At this level, the number of iterations should also increase.

The output of the minimization is the set of alpha-coefficients. The procedure starts with an initial value, and after each (sub-)iteration the input coefficients are updated. Also, once each stage of iteration, as specified above, is done, its output coefficients should travel to the start of the next stage. This can be done with a single line in Scala:

 val finalCoefficients = registrationParameters.foldLeft(initialCoefficients)((modelCoefficients, regParameters) =>
      doRegistration(posteriorGP, refMesh, targetMeshes(n), regParameters, modelCoefficients))

The function doRegistration is elucidated in the package file.
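The coefficient-threading above is just a foldLeft: each stage receives the previous stage’s output as its start value. A plain-Scala sketch with a stand-in for doRegistration (the stage logic below is illustrative only):

```scala
// Plain-Scala sketch of threading coefficients through registration stages.
object FoldLeftSketch extends App {
  case class Params(weight: Double)
  val stages = Seq(Params(1e-1), Params(1e-2), Params(1e-4))

  // stand-in for doRegistration: here it just appends the stage weight
  def runStage(coeffs: List[Double], p: Params): List[Double] = coeffs :+ p.weight

  // foldLeft feeds each stage's result into the next one
  val finalCoeffs = stages.foldLeft(List.empty[Double])(runStage)
  println(finalCoeffs) // List(0.1, 0.01, 1.0E-4)
}
```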

Finally, the minimizing coefficients are calculated and stored in finalCoefficients.

6- The fitted mesh

The coefficients are now used to deform the reference mesh (all of its points) onto a fitted mesh approximating the target mesh. The following two lines calculate the deformation field based on \mu+\sum_{i=1}^{r}\alpha_i\sqrt{\lambda_i}\phi_i with the calculated \alpha_i‘s.

 val transformationSpace = GaussianProcessTransformationSpace(posteriorGP)
 val registrationTransformation = transformationSpace.transformForParameters(finalCoefficients)

Applying the transformation on the reference mesh to get the fitted mesh:

 val fittedMesh = refMesh.transform(registrationTransformation)

7- Visualising the fitted mesh

Here, the fitted mesh onto each target mesh is visualised.

 val fittedMeshGroup =  ui.createGroup("fittedMeshGroup")
 val fittedMeshView = ui.show(fittedMeshGroup, fittedMesh, "fittedMesh")
 val targetMeshView = ui.show(fittedMeshGroup, targetMeshes(n), s"TargetMesh $n")

The fitted mesh onto the first target mesh:

Fig. 5. Fitted mesh to the first target mesh.

To visually assess the accuracy of the fit, we can scroll through the cross sectional contours (normal to the coordinate system’s axes) of the fitted and the target meshes simultaneously. For example:

Fig. 6. Some cross sectional contours.

Based on the visualization of the cross sectional contours, the fitted mesh closely approximates the target mesh except at some end cross sections where local geometrical variations are relatively steep.

The other target mesh is approximated/fitted as below:

Fig. 7. Fitted mesh to the second target mesh.

The worst mismatches of the cross sectional contours for this case:

Fig. 8. Some cross sectional contour mismatches.

To accept or not to accept a fitted mesh; this is the question.

It depends on how much discrepancy is allowed in a particular application. Nonetheless, life goes on regardless of our choice. If we are picky, we can play with the kernel (and the registration parameters) to get a better fit.

The average distance and the Hausdorff distance are available in Scalismo as quantitative measures of fit:

 val AveDist  = scalismo.mesh.MeshMetrics.avgDistance(targetMeshes(n), fittedMesh)
 val HsdrfDist  = scalismo.mesh.MeshMetrics.hausdorffDistance(targetMeshes(n), fittedMesh)
 println(s"\nAverage distance is $AveDist")
 println(s"\nHausdorff distance is $HsdrfDist \n")

Distance of first target mesh to its fitted mesh:
Average: 1.2 mm
Hausdorff: 3.9 mm
Distance of second target mesh to its fitted mesh:
Average: 1.4 mm
Hausdorff : 4.9 mm

8- Writing the fitted mesh into a file

Before writing the fitted meshes to files, we can verify that the reference mesh and each fitted mesh have the same number of nodes (here, 4470); indeed, all in (one-to-one) correspondence. Use the following to access the attribute containing the number of points of the reference mesh (or any other mesh):

 refMesh.pointSet.numberOfPoints

Writing to STL files:

 val fittedMeshFile =  new java.io.File(registeredMeshDir + s"ftdMesh$n.stl")
 MeshIO.writeSTL(fittedMesh, fittedMeshFile)

9- Finishing the code

We close the for-loop and use some lines to exit the run:

// to remove the group view after each fitting procedure is done
 removeGroupView(fittedMeshGroup) 

 } //closes the for-loop

 val finito = readLine("finito [y/n]? ")
 if (finito == "y") ui.close()

} // closes the object

[1] M. Lüthi, T. Gerig, C. Jud, and T. Vetter, “Gaussian Process Morphable Models,” IEEE Transactions on Pattern Analysis and Machine Intelligence, vol. 40, no. 8, Aug. 2018.

Appendix: package

package custom

import scalismo.geometry._
import scalismo.common._
import scalismo.ui.api._
import scalismo.mesh._
import scalismo.registration.LandmarkRegistration
import scalismo.io.{MeshIO, LandmarkIO}
import scalismo.numerics.UniformMeshSampler3D
import breeze.linalg.{DenseMatrix, DenseVector, csvread}
import scalismo.registration.{Transformation, RotationTransform, TranslationTransform, RigidTransformation}
import scala.io.StdIn.{readLine, readInt}
import scalismo.numerics._
import scalismo.kernels._
import scalismo.statisticalmodel._
import scalismo.registration._

package object methods {
  implicit val rng = scalismo.utils.Random(42)
  // removing group view
  def removeGroupView(groupView:scalismo.ui.api.Group) : Unit = {
    val s = readLine(s"remove ${groupView.name} or just continue [y/n]? ")
    if (s == "y") groupView.remove()
  }

  // registration

  case class RegistrationParameters(regularizationWeight : Double, numberOfIterations : Int, numberOfSampledPoints : Int)

  def doRegistration(
                      lowRankGP : LowRankGaussianProcess[_3D, EuclideanVector[_3D]],
                      referenceMesh : TriangleMesh[_3D],
                      targetMesh : TriangleMesh[_3D],
                      registrationParameters : RegistrationParameters,
                      initialCoefficients : DenseVector[Double]
                    ) : DenseVector[Double] =
  {
    val transformationSpace = GaussianProcessTransformationSpace(lowRankGP)
    val fixedImage = referenceMesh.operations.toDistanceImage
    val movingImage = targetMesh.operations.toDistanceImage
    val sampler = FixedPointsUniformMeshSampler3D(referenceMesh, registrationParameters.numberOfSampledPoints)
    val metric = MeanSquaresMetric(fixedImage,movingImage,transformationSpace,sampler)
    val optimizer = LBFGSOptimizer(registrationParameters.numberOfIterations)
    val regularizer = L2Regularizer(transformationSpace)
    val registration = Registration(
      metric,
      regularizer,
      registrationParameters.regularizationWeight,
      optimizer
    )
    val registrationIterator = registration.iterator(initialCoefficients)
    val visualizingRegistrationIterator = for ((it, itnum) <- registrationIterator.zipWithIndex) yield {
      println(s"object value in iteration $itnum is ${it.value}")
      //gpView.coefficients = it.parameters
      it
    }
    val registrationResult = visualizingRegistrationIterator.toSeq.last
    registrationResult.parameters
  }

}