The fruits of my investigations into turning a set of retopologised FACS scans into a usable face rig in Blender. The textures adapt to the face pose: a custom difference-based colour math node, driven by the weight of each FACS morph target (called a Shape Key in Blender), mixes the scan textures. Texture mixing is done entirely in Blender shader nodes, completely procedurally. Rendered with my custom Dermis skin shader, using scan data from the 3D Scan Store.
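The difference-based mixing described above can be sketched in plain NumPy as a stand-in for the shader node math (the function name and array layout here are illustrative, not the actual node setup): each expression texture contributes its per-pixel difference from the neutral texture, scaled by that Shape Key's current value.

```python
import numpy as np

def blend_textures(neutral, expressions, weights):
    """Difference-based texture blend, a sketch of the shader-node math.

    neutral:     HxWx3 float array, the neutral-pose texture.
    expressions: list of HxWx3 float arrays, one per FACS expression scan.
    weights:     list of Shape Key values in [0, 1], matching `expressions`.
    """
    result = neutral.astype(np.float64).copy()
    for tex, w in zip(expressions, weights):
        # Add only the difference from neutral, so partially-raised keys
        # contribute partial colour changes and fully-zero keys contribute nothing.
        result += w * (tex.astype(np.float64) - neutral)
    return np.clip(result, 0.0, 1.0)
```

With all weights at zero this returns the neutral texture unchanged; with one weight at 1.0 it reproduces that expression's scan texture exactly, which is what makes the baked maps track the mesh pose.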
Dermis Shader: https://artstn.co/m/JDln
3D Scan Store FACS data: https://www.3dscanstore.com/3d-head-models/expression-scans/male-01/15-expression-scans-male
HDRIs from: https://polyhaven.com/hdris and http://www.hdrlabs.com/sibl/archive.html
Animation Test (Audio + Video Reference source: https://youtu.be/bAxmIxGXMOY)
Music visualiser made by driving the Shape Key values with an Animation Nodes setup that reacts to the audio (all animation done procedurally, no keyframes).
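An audio-reactive driver like this can be modelled outside Blender as a simple per-frame mapping (this is an illustrative stand-in, not the actual Animation Nodes graph): take the RMS level of the audio samples for each frame and normalise it into the Shape Key's [0, 1] value range.

```python
import math

def rms(samples):
    """Root-mean-square level of one frame's audio samples."""
    return math.sqrt(sum(s * s for s in samples) / len(samples))

def audio_to_shape_key_values(frames, peak=None):
    """Map per-frame sample lists to Shape Key values in [0, 1].

    frames: list of lists of audio samples, one list per animation frame.
    peak:   optional normalisation level; defaults to the loudest frame.
    """
    levels = [rms(f) for f in frames]
    peak = peak or max(levels) or 1.0  # avoid dividing by zero on silence
    return [min(level / peak, 1.0) for level in levels]
```

Inside Blender the resulting per-frame values would be written to the Shape Key by the node setup each frame, so no keyframes are ever stored.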
Shape Key Test
Shape Key Test Wireframe
Quick proof-of-concept of using empties to drive eye position and shape key factors.
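Driving a Shape Key from an empty boils down to a clamped linear mapping of the empty's local offset into the key's [0, 1] value range; in Blender this lives in a driver expression on the Shape Key. The function below is a hypothetical stand-alone model of that driver (the `travel` distance is an assumed rig parameter, not from the original setup):

```python
def empty_to_shape_key_value(empty_z, travel=0.1):
    """Model of a Shape Key driver: clamp an empty's local Z offset to [0, 1].

    empty_z: the empty's local Z position in metres.
    travel:  assumed distance (0.1 m) the empty moves for a full-strength key.

    The equivalent Blender driver expression would be along the lines of
    min(max(var / 0.1, 0.0), 1.0), with `var` bound to the empty's Z location.
    """
    return min(max(empty_z / travel, 0.0), 1.0)
```

Eye position works the same way, except the empty's offset drives bone rotation (or the eye object's rotation) instead of a Shape Key value.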
Baked diffuse texture from the animation test (note how textures change to reflect mesh change)
Baked normal texture from the animation test (note how textures change to reflect mesh change)
Baked specular texture from the animation test (note how textures change to reflect mesh change)