blendify.utils package¶
Submodules¶
blendify.utils.camera_trajectory module¶
Code taken from https://github.com/vguzov/cloudrender/blob/main/cloudrender/camera/trajectory.py
- class blendify.utils.camera_trajectory.Trajectory[source]¶
Bases:
object
- add_keypoint(quaternion: Sequence[float], position: Sequence[float], time: float, check_time: bool = True)[source]¶
- refine_trajectory(time_step: float = 0.016666666666666666, interp_type: str = 'quadratic', smoothness: float = 5.0)[source]¶
Refines the trajectory by creating keypoints in between the existing ones via interpolation
- Parameters:
time_step (float) – time step between newly created keypoints (the default corresponds to 60 keypoints per second)
interp_type (str) – interpolation type, “linear”, “quadratic”, “cubic”
smoothness (float) – smoothing strength for the pose trajectory
- Returns:
trajectory keypoints; each keypoint is a dict with “time”, “position” and “quaternion” keys
- Return type:
List[dict]
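A minimal usage sketch; the quaternion component order assumed below is an assumption, so check the source for the convention used:

from blendify.utils.camera_trajectory import Trajectory

trajectory = Trajectory()
# Two keypoints: identity rotation, camera moving along x between t=0s and t=2s
trajectory.add_keypoint(quaternion=(1.0, 0.0, 0.0, 0.0), position=(0.0, 0.0, 5.0), time=0.0)
trajectory.add_keypoint(quaternion=(1.0, 0.0, 0.0, 0.0), position=(2.0, 0.0, 5.0), time=2.0)
# Densify to ~60 keypoints per second; each returned entry is a dict
# with "time", "position" and "quaternion" keys
keypoints = trajectory.refine_trajectory(time_step=1 / 60, interp_type="quadratic")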
blendify.utils.image module¶
- blendify.utils.image.blend_with_background(img: ndarray, bkg_color: ndarray | Tuple[float, float, float] = (1.0, 1.0, 1.0)) ndarray [source]¶
Blend an RGBA image with a uniformly colored background and return the resulting RGB image
- Parameters:
img – RGBA foreground image
bkg_color – RGB uniform background color (default is white)
- Returns:
RGB image blended with background
- Return type:
np.ndarray
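A minimal usage sketch, assuming the image is a float array with values in [0, 1] (the value range is an assumption):

import numpy as np
from blendify.utils.image import blend_with_background

rgba = np.random.rand(256, 256, 4)  # placeholder RGBA foreground image
rgb = blend_with_background(rgba, bkg_color=(1.0, 1.0, 1.0))  # composite over white
# rgb.shape == (256, 256, 3)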
blendify.utils.pointcloud module¶
- class blendify.utils.pointcloud.PCMeshifier(subsampled_mesh_faces_count=1000000, texture_resolution=(30000, 30000), pc_subsample=None, max_query_size=2000, knn=3, gpu_index=None, bpa_radius=None)[source]¶
Bases:
object
- generate_naive_uvmap(o3d_mesh: TriangleMesh)[source]¶
Packs faces one after another in the texture, two triangles per texture square
- Parameters:
o3d_mesh (o3d.geometry.TriangleMesh) – mesh to generate UV map for
- generate_texture_from_pc(o3d_mesh: TriangleMesh, uv_map: ndarray, pc_vertices: ndarray, pc_colors: ndarray, query_block_size: int = 2000)[source]¶
- generate_texture_from_pc_block_by_block(o3d_mesh: TriangleMesh, uv_map: ndarray, pc_vertices: ndarray, pc_colors: ndarray, query_block_size: int = 2000)[source]¶
- np_dtype¶
alias of
float64
- o3d_mesh_from_pc(pc_vertices: ndarray, pc_colors: ndarray | None = None, pc_normals: ndarray | None = None)[source]¶
- torch_dtype = torch.float32¶
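A sketch of driving PCMeshifier directly; the call order and return values assumed below are not confirmed by the signatures alone, so treat this as an illustration (the meshify_pc helper further down wraps the same pipeline):

import numpy as np
from blendify.utils.pointcloud import PCMeshifier

pc_vertices = np.random.rand(10_000, 3)  # placeholder pointcloud
pc_colors = np.random.rand(10_000, 3)

meshifier = PCMeshifier(subsampled_mesh_faces_count=50_000, texture_resolution=(2048, 2048))
mesh = meshifier.o3d_mesh_from_pc(pc_vertices, pc_colors)  # reconstruct a mesh from the PC
uv_map = meshifier.generate_naive_uvmap(mesh)              # assumed to return the per-face UV map
texture = meshifier.generate_texture_from_pc(mesh, uv_map, pc_vertices, pc_colors)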
- blendify.utils.pointcloud.approximate_colors_from_camera(camera_viewdir: ndarray, vertex_normals: ndarray, per_vertex_color: ndarray | Tuple[float, float, float] | Tuple[float, float, float, float], back_color: ndarray | Tuple[float, float, float] | Tuple[float, float, float, float] = (0.6, 0.6, 0.6))[source]¶
Approximates which point cloud vertices are visible from the camera. Vertices keep their initial color only if they are visible (visibility is approximated by the angle between the vertex normal and the camera’s view direction); all other vertices are colored with back_color.
- Parameters:
camera_viewdir (np.ndarray) – view direction of the camera
vertex_normals (np.ndarray) – per-vertex normals of the point cloud
per_vertex_color (Union[Vector3d, Vector4d]) – colors for the point cloud
back_color (Union[Vector3d, Vector4d], optional) – color for vertices that are not visible from the camera according to the visibility approximation described above (default: (0.6, 0.6, 0.6))
- Returns:
new per-vertex coloring with invisible vertices colored in back_color
- Return type:
np.ndarray
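The visibility test reduces to the sign of the angle between each normal and the view direction; a minimal sketch of the idea (not the library's exact implementation):

import numpy as np

def visible_mask(camera_viewdir: np.ndarray, vertex_normals: np.ndarray) -> np.ndarray:
    # A vertex is treated as visible when its normal faces the camera,
    # i.e. the normal and the view direction point in opposing directions
    viewdir = camera_viewdir / np.linalg.norm(camera_viewdir)
    return vertex_normals @ viewdir < 0.0

# Visible vertices keep their color, the rest get back_color, e.g.:
# colors = np.where(visible_mask(viewdir, normals)[:, None], per_vertex_color, back_color)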
- blendify.utils.pointcloud.estimate_normals_from_pointcloud(pc_vertices: ndarray, backend='open3d', device='gpu')[source]¶
Approximate PC per-vertex normals using algorithms implemented in open3d or pytorch3d.
- Parameters:
pc_vertices (np.ndarray) – pointcloud vertices (n_vertices x 3)
backend (str, optional) – backend used to estimate normals; “pytorch3d” and “open3d” are supported (default: “open3d”)
device (str, optional) – pytorch device (default: “gpu”)
- Returns:
estimated normals n_vertices x 3
- Return type:
np.ndarray
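A minimal usage sketch; the device argument presumably only matters for the pytorch3d backend (an assumption based on the signature):

import numpy as np
from blendify.utils.pointcloud import estimate_normals_from_pointcloud

pc_vertices = np.random.rand(5_000, 3)  # placeholder pointcloud
normals = estimate_normals_from_pointcloud(pc_vertices, backend="open3d")
# normals.shape == (5_000, 3)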
- blendify.utils.pointcloud.estimate_pc_normals_from_mesh(pc_vertices: ndarray, mesh: Trimesh)[source]¶
Approximate PC per-vertex normals from a mesh that is registered to the PC. For each PC vertex, averages the vertex normals of the 5 nearest mesh vertices.
- Parameters:
pc_vertices (np.ndarray) – pointcloud vertices (n_vertices x 3)
mesh (trimesh.base.Trimesh) – mesh registered to the PC
- Returns:
estimated normals n_vertices x 3
- Return type:
np.ndarray
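The averaging step can be reproduced with a k-d tree; a sketch of the idea under the same 5-nearest-neighbours assumption (not the library's exact code):

import numpy as np
import trimesh
from scipy.spatial import cKDTree

def average_mesh_normals(pc_vertices: np.ndarray, mesh: trimesh.Trimesh, k: int = 5) -> np.ndarray:
    tree = cKDTree(mesh.vertices)
    _, idx = tree.query(pc_vertices, k=k)            # k nearest mesh vertices per PC vertex
    normals = mesh.vertex_normals[idx].mean(axis=1)  # average their normals
    return normals / np.linalg.norm(normals, axis=1, keepdims=True)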
- blendify.utils.pointcloud.meshify_pc(pc_vertices, pc_colors, pc_normals=None, subsampled_mesh_faces_count=1000000, texture_resolution=(30000, 30000), pc_subsample=None, max_query_size=2000, knn=3, gpu_index=None, bpa_radius=None)[source]¶
Turns pointcloud into a subsampled mesh with texture.
- Parameters:
pc_vertices (np.ndarray) – pointcloud vertices (n_vertices x 3)
pc_colors (np.ndarray) – pointcloud colors (n_vertices x 3)
pc_normals (np.ndarray, optional) – pointcloud normals (n_vertices x 3). If None, normals will be estimated from the pointcloud (default: None)
subsampled_mesh_faces_count (int, optional) – number of faces in the subsampled mesh (default: 1_000_000)
texture_resolution (tuple, optional) – texture resolution (default: (30_000, 30_000))
pc_subsample (int, optional) – number of vertices to subsample from the pointcloud (default: None)
max_query_size (int, optional) – maximum query size for texture generation (default: 2_000)
knn (int, optional) – number of nearest neighbors for texture generation (default: 3)
gpu_index (int, optional) – index of the GPU used for the texture generation stage. If None, the CPU will be used (default: None)
bpa_radius (float, optional) – radius for the ball pivoting algorithm used for mesh reconstruction. If None, the radius will be estimated automatically from the PC density (default: None)
- Returns:
np.ndarray – vertices of the mesh (V, 3)
np.ndarray – faces of the mesh (F, 3)
np.ndarray – per-face UV map of the mesh (F, 3, 2)
np.ndarray – texture of the mesh (texture_resolution[0], texture_resolution[1], 3)
- Return type:
Tuple[np.ndarray, np.ndarray, np.ndarray, np.ndarray]
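A minimal usage sketch with placeholder data and deliberately reduced settings (the default 30_000 x 30_000 texture is expensive to generate):

import numpy as np
from blendify.utils.pointcloud import meshify_pc

pc_vertices = np.random.rand(100_000, 3)  # placeholder pointcloud
pc_colors = np.random.rand(100_000, 3)
vertices, faces, uv_map, texture = meshify_pc(
    pc_vertices, pc_colors,
    subsampled_mesh_faces_count=100_000,
    texture_resolution=(4096, 4096),
)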
- blendify.utils.pointcloud.meshify_pc_from_file(pc_filepath, output_dir, subsampled_mesh_faces_count=1000000, texture_resolution=(30000, 30000), pc_subsample=None, max_query_size=2000, knn=3, gpu_index=None, bpa_radius=None)[source]¶
Turns a pointcloud into a subsampled mesh with texture. Reads the pointcloud from a file and writes the results to a folder.
Produces mesh.ply, uv_map.npy and texture.jpg in the target folder.
- Parameters:
pc_filepath – Path to a pointcloud
output_dir – Path to a directory which will contain the result
subsampled_mesh_faces_count (int, optional) – number of faces in the subsampled mesh (default: 1_000_000)
texture_resolution (tuple, optional) – texture resolution (default: (30_000, 30_000))
pc_subsample (int, optional) – number of vertices to subsample from the pointcloud (default: None)
max_query_size (int, optional) – maximum query size for texture generation (default: 2_000)
knn (int, optional) – number of nearest neighbors for texture generation (default: 3)
gpu_index (int, optional) – index of the GPU used for the texture generation stage. If None, the CPU will be used (default: None)
bpa_radius (float, optional) – radius for the ball pivoting algorithm used for mesh reconstruction. If None, the radius will be estimated automatically from the PC density (default: None)
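A minimal usage sketch (the paths are hypothetical):

from blendify.utils.pointcloud import meshify_pc_from_file

# Writes mesh.ply, uv_map.npy and texture.jpg into the output directory
meshify_pc_from_file("scan.ply", "meshified_output", texture_resolution=(4096, 4096))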
blendify.utils.shadow_catcher module¶
- blendify.utils.shadow_catcher.set_shadow_catcher(obj: str | Renderable, state=False)[source]¶
Set the object’s is_shadow_catcher and various visibility_* properties to make the object act as a shadow catcher, or revert the changes.
- Parameters:
obj (Union[str, Renderable]) – tag of the Blender object or instance of Renderable
state (bool) – if True make obj a shadow catcher, otherwise make it a regular object
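A minimal usage sketch; the tag below is hypothetical and must refer to a previously created Renderable:

from blendify.utils.shadow_catcher import set_shadow_catcher

# "ground_plane" is a hypothetical tag of an existing Blender object
set_shadow_catcher("ground_plane", state=True)   # act as a shadow catcher
set_shadow_catcher("ground_plane", state=False)  # revert to a regular object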
blendify.utils.smpl_wrapper module¶
- class blendify.utils.smpl_wrapper.SMPLWrapper(smpl_root: str, gender: str, shape_params: ndarray, device: device | None = None)[source]¶
Bases:
object
A wrapper for the SMPL model
- get_smpl(pose_params: ndarray, translation_params: ndarray) ndarray [source]¶
Get the SMPL mesh vertices from the target pose and global translation
- Parameters:
pose_params – Pose parameters vector of shape (72)
translation_params – Global translation vector of shape (3)
- Returns:
vertices of SMPL model
- Return type:
np.ndarray
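A minimal usage sketch; the model path is hypothetical, and 10 shape coefficients is assumed as the standard SMPL beta count:

import numpy as np
from blendify.utils.smpl_wrapper import SMPLWrapper

smpl = SMPLWrapper(smpl_root="models/smpl", gender="neutral", shape_params=np.zeros(10))
# T-pose at the origin: zero pose and zero global translation
vertices = smpl.get_smpl(pose_params=np.zeros(72), translation_params=np.zeros(3))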