Random Dumps

tomtung edited this page Nov 17, 2014 · 2 revisions

Algorithm: General Description

We are mostly re-implementing the Chameleon system.

We can think of the system as having two major modes / states: drawing and viewing. The system starts in viewing mode. The mesh uses a global MeshFaceMaterial, initially containing a single sub-material with a single-pixel texture.

  • In viewing mode, a draw action triggers the state transition to drawing mode. During the transition, we need to prepare the temporary texture (layer) to draw on:

    • Project the existing texture using the standard pipeline: use the same type of renderer with the same parameters, but with only a white ambient light
    • Get the coordinates of all vertices in screen space
    • Temporarily disable the global texture and use the temporary texture (layer) instead, updating the UV values of the vertices to match their screen-space coordinates so that the temporary texture is displayed correctly
      • The easiest way might be to replace the material, changing it from a MeshFaceMaterial to, for example, a MeshLambertMaterial
  • In drawing mode, a draw action keeps it in drawing mode.

    • For the moment, let's ignore the boundary detection problem in Figure 4 of the paper.
    • When drawing, each time the mouse moves to a new location, do a simple Z-buffer test at the cursor location to find the triangle under the cursor that is closest to the camera. Then breadth-first search the adjacent triangles affected by the stroke, skipping those that would be back-face culled.
  • In drawing mode, a zoom/rotate/move/resize action triggers the state transition to viewing mode. During the transition, the modified part of the temporary texture should be clipped and merged into the global texture.

  • Clip the texture patch that contains all affected triangles, create a new sub-material from it, add it to the global MeshFaceMaterial, and set the mesh's material back to the global MeshFaceMaterial.
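The core of the viewing → drawing transition above is mapping each projected vertex onto the temporary screen-space layer. A minimal sketch of that mapping, written as a plain function so it can be checked without three.js (the helper name `ndcToUv` is an assumption, not from the original code):

```javascript
// Hypothetical helper: map normalized device coordinates (x, y in [-1, 1],
// as produced by projecting a vertex with the camera) to UV coordinates
// in [0, 1] on the temporary screen-space texture.
function ndcToUv(x, y) {
  return { u: (x + 1) / 2, v: (y + 1) / 2 };
}

// The screen center lands in the middle of the temporary texture.
console.log(ndcToUv(0, 0)); // { u: 0.5, v: 0.5 }
```

In three.js, the NDC input would come from something like `vertex.clone().project(camera)`; writing the resulting (u, v) pairs into `faceVertexUvs` makes the temporary layer line up with the screen.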

We also need to handle texture exporting correctly, which includes the following steps:

  • Packing patches (using the algorithm described in the paper) into a new texture file and updating UV values accordingly
  • Exporting the geometry with the updated UV values into a new OBJ file
  • Creating the aforementioned files on the fly and letting the user download them
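The OBJ export step can be sketched as a small serializer. `toObj` and its plain-array inputs are illustrative assumptions, not the actual export code:

```javascript
// Sketch: serialize geometry to Wavefront OBJ text.
// vertices: array of [x, y, z]; uvs: array of [u, v];
// faces: array of vertex-index triples (0-based).
function toObj(vertices, uvs, faces) {
  var lines = [];
  vertices.forEach(function (v) {
    lines.push('v ' + v[0] + ' ' + v[1] + ' ' + v[2]);
  });
  uvs.forEach(function (t) {
    lines.push('vt ' + t[0] + ' ' + t[1]);
  });
  faces.forEach(function (f) {
    // OBJ indices are 1-based; here each vertex reuses its own UV index.
    lines.push('f ' + f.map(function (i) {
      return (i + 1) + '/' + (i + 1);
    }).join(' '));
  });
  return lines.join('\n');
}
```

In the browser, the resulting string could be wrapped in a Blob and offered to the user through `URL.createObjectURL`, which covers the "create files on the fly and let the user download them" step.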

Some links that might be useful

Interpolation code by Wansui

function getUV(n1, n2, n3, fn, point) {
    var v1 = ball.geometry.vertices[n1];
    var v2 = ball.geometry.vertices[n2];
    var v3 = ball.geometry.vertices[n3];

    // faceVertexUvs[0] is the first UV layer; fn indexes the face within it.
    var faceUvs = ball.geometry.faceVertexUvs[0][fn];
    var uv1 = faceUvs[0];
    var uv2 = faceUvs[1];
    var uv3 = faceUvs[2];

    // Vectors from the query point to each vertex.
    var f1 = new THREE.Vector3().subVectors(v1, point);
    var f2 = new THREE.Vector3().subVectors(v2, point);
    var f3 = new THREE.Vector3().subVectors(v3, point);

    // Barycentric weights: each sub-triangle area over the full triangle area.
    var p12 = new THREE.Vector3().subVectors(v1, v2);
    var p13 = new THREE.Vector3().subVectors(v1, v3);
    var vec = new THREE.Vector3();
    var a = vec.crossVectors(p12, p13).length();
    var a1 = vec.crossVectors(f2, f3).length() / a;
    var a2 = vec.crossVectors(f3, f1).length() / a;
    var a3 = vec.crossVectors(f1, f2).length() / a;

    // Clone before scaling so the geometry's stored UVs are not mutated.
    var uv = uv1.clone().multiplyScalar(a1);
    uv.add(uv2.clone().multiplyScalar(a2));
    uv.add(uv3.clone().multiplyScalar(a3));
    return uv;
}
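The barycentric math in the function above can be sanity-checked without three.js by rewriting it over plain arrays (the names here are illustrative):

```javascript
// Barycentric UV interpolation over plain arrays, mirroring getUV:
// points are [x, y, z], UVs are [u, v].
function sub(a, b) { return [a[0] - b[0], a[1] - b[1], a[2] - b[2]]; }
function cross(a, b) {
  return [a[1] * b[2] - a[2] * b[1],
          a[2] * b[0] - a[0] * b[2],
          a[0] * b[1] - a[1] * b[0]];
}
function len(a) { return Math.sqrt(a[0] * a[0] + a[1] * a[1] + a[2] * a[2]); }

function interpolateUv(p1, p2, p3, uv1, uv2, uv3, point) {
  var f1 = sub(p1, point), f2 = sub(p2, point), f3 = sub(p3, point);
  var a  = len(cross(sub(p1, p2), sub(p1, p3))); // twice the triangle area
  var a1 = len(cross(f2, f3)) / a;               // weight of vertex 1
  var a2 = len(cross(f3, f1)) / a;               // weight of vertex 2
  var a3 = len(cross(f1, f2)) / a;               // weight of vertex 3
  return [uv1[0] * a1 + uv2[0] * a2 + uv3[0] * a3,
          uv1[1] * a1 + uv2[1] * a2 + uv3[1] * a3];
}
```

For example, on the right triangle (0,0,0), (1,0,0), (0,1,0) with UVs (0,0), (1,0), (0,1), the point (0.25, 0.25, 0) interpolates to the UV (0.25, 0.25), as expected for a mapping that is affine over the face.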