-
Are there any examples of supplying your own mesh (for example from scipy Delaunay) and extracting basis functions of some degree? For example, you might evaluate the basis functions at a set of points for fitting against data, as a kind of smoothing spline.
Replies: 58 comments
-
There is an example of defining the mesh points and triangles in the README. It is not specific to scipy Delaunay though.
I think there are several examples on interpolating the finite element solution at specific points. One of them is here: https://github.com/kinnala/scikit-fem/blob/master/docs/examples/ex12.py#L39
-
I think scipy Delaunay is something like this:

```python
import numpy as np
from scipy.spatial import Delaunay
from skfem import MeshTri

points = np.array([[0, 0], [0, 1.1], [1, 0], [1, 1]])
tri = Delaunay(points)
mesh = MeshTri(points.T, tri.simplices.T)
```
-
@kinnala Thanks. Yes, that works, and I am up to the point of trying to understand why this is blowing up memory under the hood.
I think it should just be evaluating the basis functions and returning something sparse of roughly 30k × 800, which is fine. So I obviously must not yet understand what I am using :)
-
I don't use these features very often but I think CellBasis.probes creates some sort of interpolation operator instead of just interpolating at specific points. Maybe CellBasis.interpolator is more useful for your use case?
-
Alright, seems that interpolator uses probes nowadays so no help there. So you want to interpolate at a large number of points at the same time? Could look into further optimizing that specific case unless you want to use a for-loop.
This is not a very natural thing to do in the context of finite elements because you must find the correct element for each point separately which will be a nontrivial process. That part is not very well optimized at the moment, it uses scipy KDTree to find candidate elements and then tests those via inverse reference mapping.
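The candidate-search strategy described above can be sketched with plain scipy. This is an illustration of the idea only, not scikit-fem's actual implementation; all variable names here are mine:

```python
import numpy as np
from scipy.spatial import Delaunay, cKDTree

rng = np.random.default_rng(0)
pts = rng.random((200, 2))                  # mesh vertices in the unit square
tri = Delaunay(pts)

# a KDTree over element centroids yields candidate elements per query point
centroids = pts[tri.simplices].mean(axis=1)
tree = cKDTree(centroids)

query = np.array([[0.5, 0.5], [0.25, 0.75]])
_, candidates = tree.query(query, k=5)      # 5 nearest candidate elements each

# each candidate would then be verified via the inverse reference mapping;
# here Delaunay's own point location stands in for that verification step
cells = tri.find_simplex(query)
```

The expensive part is the verification loop over candidates, which is what makes many-point queries slow.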
-
This is true, but is it not so much about finite elements as about unstructured meshes? Wouldn't it equally afflict finite volumes? The blow-up is in …. While, I agree, it's not a very natural thing to do in finite elements, it is a very natural thing to want to do in applications that otherwise have recourse to finite elements; three examples: particle tracking, finding the foot of a characteristic, and interpolating between meshes.

I believe FreeFEM has good implementations of the first and last of these, probably using a common facility for interpolation. It is C++ rather than Python, and I've refrained from looking at the implementation anyway as it's GPL and so couldn't be copied here. One thing that might be exploited in each of the three examples is that one has a good initial guess for the cell at second and subsequent steps; e.g. particle tracking and finding the foot of a characteristic require time-stepping, and a step is likely to stay within a cell or move to a neighbour through a particular face. Similarly for interpolating between meshes if the new mesh is deformed from the old. Perhaps this is more natural in a finite volume mesh data structure in which the facets of cells are known, but we do already have that: …

Another thing might be that the 32 000 probes are embarrassingly parallel. (I wouldn't think that that would ease the memory blow-up though, depending on your architecture.) Is your application like one of the above three?
-
Of course, if you do have a scipy Delaunay object, you can use find_simplex. It would be interesting to see how that compares for speed and memory.
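For reference, a minimal find_simplex usage; points outside the triangulation get index -1:

```python
import numpy as np
from scipy.spatial import Delaunay

points = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
tri = Delaunay(points)

query = np.array([[0.25, 0.25], [2.0, 2.0]])
cells = tri.find_simplex(query)   # containing-triangle index, -1 if outside
```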
-
Interpolating between meshes could be further abstracted using something similar to …
-
@kinnala Ah ok, a for-loop is fine for now to try something out ... If I understand correctly, there is some interaction effect going on. It's been over a decade since I've done anything with FE solvers, but on trying to remember this stuff it seems interesting that the computational flows are quite different for the data-assimilation problem vs Galerkin DE solving. I might be way off, but I am thinking of the variational log-likelihood setting as "data loss" + "smoothing loss" + boundary conditions, similar to a non-linear forced Poisson equation. I suspect the weird flow comes in because the data loss requires evaluation at (potentially) off-mesh points, but only once (I am considering a fixed mesh in this context). Thanks for the help!
-
One could try to hack this method so that the user can pass another query function, e.g., something based on ….

Edit: Maybe it's simpler to inherit ….

Edit 2: Reading …
-
Actually, to me this part of …

So I wonder if that could be used for any mesh if we just start from an arbitrary edge.

Edit: Shame that there is not any wrapper to call it directly. I think calling …
-
As I'm reading through things, the other issue I'm seeing is that I'm not sure yet how to create the Laplacian operator directly, as I think the problems being dealt with here are typically linear DEs cast to the weak form via integration by parts (this stuff is foggy memory to me, so not sure I'm making sense). The data assimilation problem is typically a minimization like this (like a smoothing spline):

$$…$$

Even though it is non-linear, if the grid is fixed I think you can just evaluate the basis functions over the ….

Unless I'm missing some trick, I think this is a different kind of operational flow using the same basis and operator discretization tools.
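The displayed objective above is missing from the thread as captured; a typical penalized least-squares objective of the kind being described (my reconstruction, not the original equation) would be

```latex
\min_{u} \sum_{i=1}^{N} \big( u(x_i) - y_i \big)^2
  + \lambda \int_{\Omega} \lVert \nabla u \rVert^2 \, \mathrm{d}x ,
```

which, after expanding $u = \sum_j c_j \phi_j$ in the finite element basis, reduces to the quadratic problem $\min_c \lVert A c - y \rVert^2 + \lambda\, c^{\mathsf T} K c$, with $A_{ij} = \phi_j(x_i)$ (the probes matrix) and $K$ the stiffness matrix.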
-
I have no idea what that is, but in finite elements a discrete Laplacian is typically the matrix corresponding to

```python
from skfem import BilinearForm
from skfem.helpers import grad, dot

@BilinearForm
def laplacian(u, v, w):
    return dot(grad(u), grad(v))

D = laplacian.assemble(basis)  # create a Basis object and pass it here
```
-
What is …?
-
And yes, …
-
And regarding Delaunay, I just used that as a quick example. I think for smoothing applications, where one expects the scale of the problem to somehow align with the intensity (number) of observations, it could make sense to use some Delaunay mesh. I think the most important thing to keep in mind for ML/smoothing/data assimilation is that one is less constrained by numerical stability concerns (as with PDEs) and more trying to live within a parametric budget. So if I have millions of points on a 2d grid, I might lay down some finite-budget mesh, either based on some prior knowledge or based on data density. But my goals are a) controlling over-fitting and b) using a limited number of dofs.
-
One thing I would check, though, is that points are not duplicated in that ix slicing. I think an np.unique there solves all of this, no?
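A tiny illustration of the deduplication (ix here is a made-up stand-in for the candidate index array, not the actual variable in the library):

```python
import numpy as np

# hypothetical candidate element indices gathered per query point,
# possibly containing duplicates
ix = np.array([3, 7, 3, 1, 7, 7])
candidates = np.unique(ix)   # sorted and deduplicated
```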
-
Also just to confirm and clarify what I would be using this stuff for in case it helps. Suppose I assemble some basis and I have 100 data points corresponding to observations of values of the function. I would assemble this data-matrix "A" once and optimize that plus other terms somehow.
And this example is just the Gaussian-noise case; you can have more complicated loss functions. But the reason one turns to FEM is that the work of matching boundaries, handling derivatives, etc. is quite a bit of accounting to write oneself. I think a lot of the ML community has been focussed on "learning the knots" (the mesh), which turns the problem into a fully non-linear one (for example RBF networks), and there you don't have any advantage in pre-computing these projection matrices since your basis functions themselves are changing with each iteration. I am trying to stick with fixed meshes for now to aim for faster calibration times.
-
Thanks for the Raket paper, I'll take a look; I'm quite at home with Green's functions, so that'll help. (You'll see in #629 & #632 that the adjoint of the operator returned by the ….)

If horrid meshes aren't going to be the daily grind, I'd suggest solving an easier case first; it does make a big difference in the finite element method. If you do have the luxury of placing the points of the mesh wherever you like within the region, use something like Gmsh, dmsh, adaptmesh, or triangle to place them reasonably.
-
That might handle the part with lots of input points but not the part with pathological meshes. Sometimes the correct element simply is not part of ix if there are many slit elements.
-
Sorry, I just read Raket (2021) and understood it (it's elementary) but didn't see the connexion to the present issue. The only domain considered there is 0 < x < 1, for which one could just use docs/examples/ex15.py (line 7 at 62882d6). I assume the domains relevant to the application are harder than that. What are they like? How are they specified?

I wonder whether this is like automatic mesh adaptation, as in docs/examples/ex22.py (line 1 at d08af2a).
-
@gdmcbain Yes, that paper is just a nice pedagogical linkage between various concepts; I think it was a course originally. The relevant application is super simple. You can of course do data assimilation with any PDE, SPDE, etc. There are packages like this: https://github.com/alan-turing-institute/stat-fem, and there are some papers in its README (https://arxiv.org/abs/1905.06391, for example; have not read it myself). The reason I am attracted to the setup with skfem is the pure Python-ness of it, so I can easily install it without any heavy extra stuff I do not need in images. If the small branch PR I just pushed is correct, it basically solves the large-number-of-points problem for me. It seems to give the same results and passes the tests locally: cottrell@f32ca21. I would also forget about the horrid mesh I provided, as that is just some bad example I created quickly. I think I am nearly there with my understanding of the layout of things. Will try to assemble a nice working example at the end of this.
-
O. K., great. This is sounding good and very interesting. Girolami et al. looks like the reference I needed. Thanks very much. I see that statFEM depends on Firedrake. That's an impressive package, but it is difficult to install; it was very similar considerations that led me from there to here. On the branch PR, I hope I'm not holding things up by not having reviewed #667 and #668; if so, I'll get onto that tomorrow.
-
No hurry, I'll look into combining #667 and #668 at some point. #668 is missing MeshTet1.element_finder and some additional tests and fixes that I added in #667. I'll try to also evaluate which approach gives better performance.
-
Another thing I'm wondering is whether the intention is to include the data term as a bilinear form (with a sum of Diracs)? Seems like it would match the flow better ... trying to see if there are examples like this. It might be that probes is still the main target for this kind of flow. One thing I'm also looking for is whether there is an established way of doing probes for a derivative or integral of the basis functions, for example if you have observational data about the derivatives instead of the actual values of the solution.
-
Close. A Dirac point source or force is a linear form (RHS, rather than LHS as for the bilinear forms). It's actually given by the transpose of …; see Example 38 (point source), https://scikit-fem.readthedocs.io/en/latest/listofexamples.html#example-38-point-source, and docs/examples/ex38.py (line 46 at d08af2a). But is this exactly what's wanted in your problem? I'm not sure.
-
No, there isn't an established way, but it's an interesting idea. If it's a functional it should be possible to express it as a matrix (…). For integrals there is ….

Really a point-probe is mathematically a functional too, but because its expression involves Dirac's δ it needs to be treated specially. Integrals over domains or boundaries or parts thereof are typically much better behaved and can just use …
-
So I am just kind of remembering (and misremembering) how all this fits together now. I was writing down the attached variational formulation, and it made me realize that yes, we could think of a "Dirac" basis for the point evaluations, as mentioned in the docstring, but it is probably not super useful to literally implement a Form in the code against a Dirac basis. I mean, that is exactly what probes is doing, but I'm not sure expressing the probes evaluation as some Form against this particular basis is ever useful for the current solver flows ... unless of course there are examples with representations in different bases (perhaps the boundary examples) that I am not aware of yet.
-
What about splitting the right-hand side? The term (δ(x − s) u(x), h(x)) is a bilinear form and so could go on the left; the term (δ(x − s) y(x), h(x)) is a linear form and stays on the right.
-
Since (δ(x − s) u(x), h(x)) = u(s) h(s), the ij-th element of that discretized bilinear form would be the product of the values of the i-th and j-th basis functions at the probe point s. But I'll leave it at that till I've read Girolami et al.; I'm sure they would've sorted all this out already.