
Material Components (uv-scroll, video-texture-target, video-texture-source) #5911

Merged · 12 commits · Jan 30, 2023
3 changes: 3 additions & 0 deletions src/app.ts
@@ -13,6 +13,7 @@ import { EffectComposer, EffectPass } from "postprocessing";
import {
Audio,
AudioListener,
Material,
Object3D,
PerspectiveCamera,
PositionalAudio,
@@ -48,6 +49,7 @@ export interface HubsWorld extends IWorld {
deletedNids: Set<number>;
nid2eid: Map<number, number>;
eid2obj: Map<number, Object3D>;
eid2mat: Map<number, Material>;
time: { delta: number; elapsed: number; tick: number };
}

@@ -113,6 +115,7 @@ export class App {
this.store = store;
// TODO: Create accessor / update methods for these maps / set
this.world.eid2obj = new Map();
this.world.eid2mat = new Map();
Contributor:

I (wrongly) assumed that the networking system would be making use of eid2obj, but this doesn't seem to be the case. I'd love to see a NetworkedUVScroll.

Contributor (Author):

I think the way I would network-sync a lot of the animation-type stuff is just to use serverTime instead. But your point stands: it should be completely trivial to network things about material entities too; they are not special.
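The serverTime approach mentioned above can be sketched as a pure function: if every client derives the animation state from a shared clock, no per-frame state needs to be networked at all. This is an illustrative sketch, not code from the PR; `serverTimeMs` and the wrapping scheme are assumptions.

```typescript
// Derive a UV offset purely from shared server time, so every client
// computes the same animation state without exchanging deltas.
function uvOffsetAtServerTime(serverTimeMs: number, speed: [number, number]): [number, number] {
  const t = serverTimeMs / 1000; // elapsed seconds of shared server time
  // Wrap into [0, 1), the same way repeating texture offsets behave.
  const wrap = (x: number) => ((x % 1) + 1) % 1;
  return [wrap(speed[0] * t), wrap(speed[1] * t)];
}
```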

Contributor:

Network IDs would need to be assigned (deterministically) to materials in setNetworkedData...

Contributor (Author):

Ah yeah that makes sense. That makes things a bit more tricky but still seems doable.
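One way the deterministic assignment discussed above could work is to derive each material's network ID from its owning object's ID plus the material's slot index, so every client arrives at the same ID without coordination. This naming scheme is purely hypothetical and is not the actual setNetworkedData logic.

```typescript
// Hypothetical deterministic network ID for a material: owner nid + slot index.
// Every client traversing the same glTF in the same order derives the same id.
function materialNid(objectNid: string, materialIndex: number): string {
  return `${objectNid}.mat${materialIndex}`;
}
```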


this.world.nid2eid = new Map();
this.world.deletedNids = new Set();
14 changes: 14 additions & 0 deletions src/bit-components.js
@@ -198,4 +198,18 @@ export const ObjectSpawner = defineComponent({
export const Billboard = defineComponent({
onlyY: Types.ui8
});
export const MaterialTag = defineComponent();
export const UVScroll = defineComponent({
speed: [Types.f32, 2],
increment: [Types.f32, 2],
offset: [Types.f32, 2]
});
export const VideoTextureSource = defineComponent({
fps: Types.ui8,
resolution: [Types.ui16, 2]
});
export const VideoTextureTarget = defineComponent({
source: Types.eid,
flags: Types.ui8
});
export const SimpleWater = defineComponent();
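The `flags: Types.ui8` field on VideoTextureTarget packs boolean options into a single byte, tested later with bitwise AND against VIDEO_TEXTURE_TARGET_FLAGS. A minimal standalone sketch of that pattern, assuming the two flag values defined in src/inflators/video-texture-target occupy the lowest bits:

```typescript
// Assumed flag values mirroring VIDEO_TEXTURE_TARGET_FLAGS; each option is one bit.
const TARGET_BASE_MAP = 1 << 0;
const TARGET_EMISSIVE_MAP = 1 << 1;

// Decode which material slots a packed flags byte targets.
function describeTargets(flags: number): string[] {
  const targets: string[] = [];
  if (flags & TARGET_BASE_MAP) targets.push("map");
  if (flags & TARGET_EMISSIVE_MAP) targets.push("emissiveMap");
  return targets;
}
```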
64 changes: 5 additions & 59 deletions src/bit-systems/camera-tool.js
@@ -19,6 +19,7 @@ import { SOUND_CAMERA_TOOL_COUNTDOWN, SOUND_CAMERA_TOOL_TOOK_SNAPSHOT } from "..
import { paths } from "../systems/userinput/paths";
import { ObjectTypes } from "../object-types";
import { anyEntityWith } from "../utils/bit-utils";
import { updateRenderTarget } from "./video-texture";

// Prefer h264 if available due to faster decoding speed on most platforms
const videoCodec = ["h264", "vp9,opus", "vp8,opus", "vp9", "vp8"].find(
@@ -129,64 +130,6 @@ function endRecording(world, camera, cancel) {
APP.hubChannel.endRecording();
}

function updateRenderTarget(world, camera) {
const sceneEl = AFRAME.scenes[0];
const renderer = AFRAME.scenes[0].renderer;

const tmpVRFlag = renderer.xr.enabled;
renderer.xr.enabled = false;

// TODO we are doing this because aframe uses this hook for tock.
// Namely to capture what camera was rendering. We don't actually use that in any of our tocks.
// Also tock can likely go away as a concept since we can just directly order things after render in raf if we want to.
const tmpOnAfterRender = sceneEl.object3D.onAfterRender;
delete sceneEl.object3D.onAfterRender;

// TODO this assumption is now not true since we are not running after render. We should probably just permanently turn off autoUpdate and run matrix updates at a point we want to.
// The entire scene graph matrices should already be updated
// in tick(). They don't need to be recomputed again in tock().
const tmpAutoUpdate = sceneEl.object3D.autoUpdate;
sceneEl.object3D.autoUpdate = false;

const bubbleSystem = AFRAME.scenes[0].systems["personal-space-bubble"];
const boneVisibilitySystem = AFRAME.scenes[0].systems["hubs-systems"].boneVisibilitySystem;

if (bubbleSystem) {
for (let i = 0, l = bubbleSystem.invaders.length; i < l; i++) {
bubbleSystem.invaders[i].disable();
}
// HACK, bone visibility typically takes a tick to update, but since we want to be able
// to have enable() and disable() be reflected this frame, we need to do it immediately.
boneVisibilitySystem.tick();
// scene.autoUpdate will be false so explicitly update the world matrices
boneVisibilitySystem.updateMatrices();
}

const renderTarget = renderTargets.get(camera);
renderTarget.needsUpdate = false;
renderTarget.lastUpdated = world.time.elapsed;

const tmpRenderTarget = renderer.getRenderTarget();
renderer.setRenderTarget(renderTarget);
renderer.clearDepth();
renderer.render(sceneEl.object3D, world.eid2obj.get(CameraTool.cameraRef[camera]));
renderer.setRenderTarget(tmpRenderTarget);

renderer.xr.enabled = tmpVRFlag;
sceneEl.object3D.onAfterRender = tmpOnAfterRender;
sceneEl.object3D.autoUpdate = tmpAutoUpdate;

if (bubbleSystem) {
for (let i = 0, l = bubbleSystem.invaders.length; i < l; i++) {
bubbleSystem.invaders[i].enable();
}
// HACK, bone visibility typically takes a tick to update, but since we want to be able
// to have enable() and disable() be reflected this frame, we need to do it immediately.
boneVisibilitySystem.tick();
boneVisibilitySystem.updateMatrices();
}
}

function updateUI(world, camera) {
const snapMenuObj = world.eid2obj.get(CameraTool.snapMenuRef[camera]);
const snapBtnObj = world.eid2obj.get(CameraTool.snapRef[camera]);
@@ -404,7 +347,10 @@ export function cameraToolSystem(world) {
world.time.tick % allCameras.length === i &&
elapsed > renderTarget.lastUpdated + VIEWFINDER_UPDATE_RATE)
) {
updateRenderTarget(world, camera);
// TODO camera tool may be able to just directly use video-texture-target/source
updateRenderTarget(world, renderTarget, CameraTool.cameraRef[camera]);
renderTarget.needsUpdate = false;
renderTarget.lastUpdated = world.time.elapsed;
if (CameraTool.state[camera] === CAMERA_STATE.RECORDING_VIDEO) {
videoRecorders.get(camera).captureFrame(renderTargets.get(camera));
}
55 changes: 55 additions & 0 deletions src/bit-systems/uv-scroll.ts
@@ -0,0 +1,55 @@
import { addComponent, defineQuery, hasComponent, removeComponent } from "bitecs";
import { Material, Mesh, MeshBasicMaterial } from "three";
import { HubsWorld } from "../app";
import { MaterialTag, Object3DTag, UVScroll } from "../bit-components";
import { mapMaterials } from "../utils/material-utils";

// We wanted uv-scroll to be a component on materials, not objects. The original uv-scroll component predated
// the concept of "material components". Also, in AFRAME, material components ended up being on objects anyway,
// so it didn't make a practical difference. This corrects the behaviour to act how we want.
const uvScrollObjectsQuery = defineQuery([UVScroll, Object3DTag]);
function migrateLegacyComponents(world: HubsWorld) {
uvScrollObjectsQuery(world).forEach(function (eid) {
const obj = world.eid2obj.get(eid)!;
const mat = (obj as Mesh).material;
// TODO We will warn once we modify the Blender addon to put these components on materials instead
// console.warn(
// "The uv-scroll component should be added directly to materials not objects, transferring to object's material"
// );
if (!mat) {
console.error("uv-scroll component added to an object without a Material");
} else {
mapMaterials(obj, function (mat: Material) {
if (hasComponent(world, UVScroll, mat.eid!)) {
console.warn(
"Multiple uv-scroll instances added to objects sharing a material; only the speed/increment from the first one will have any effect"
);
} else {
addComponent(world, UVScroll, mat.eid!);
UVScroll.speed[mat.eid!].set(UVScroll.speed[eid]);
UVScroll.increment[mat.eid!].set(UVScroll.increment[eid]);
}
});
}
removeComponent(world, UVScroll, eid);
});
}

const uvScrollQuery = defineQuery([UVScroll, MaterialTag]);
export function uvScrollSystem(world: HubsWorld) {
migrateLegacyComponents(world);
uvScrollQuery(world).forEach(function (eid) {
const map = (world.eid2mat.get(eid)! as MeshBasicMaterial).map;
if (!map) return; // Not expected in practice, but not a bug either; without a texture map there is simply no work to do.
Contributor:

I'm confused why this wouldn't be a bug. Is it valid to attach a UVScroll (material) component to something that isn't a material? Alternatively, is it valid to have a material eid that is not in eid2mat?

Contributor (Author):

No. It must be on a material, and the material will be in the map (hence the !). What is not guaranteed, and what I don't think is inherently a bug, is that a material can choose not to have a map. In practice you would not do this in a glTF file, but I could see cases where we would be UV scrolling something and changing its map. Maybe turning it off? Not sure. Maybe that is not a real use case.


const offset = UVScroll.offset[eid];
const speed = UVScroll.speed[eid];
const scale = world.time.delta / 1000;
offset[0] = (offset[0] + speed[0] * scale) % 1.0;
offset[1] = (offset[1] + speed[1] * scale) % 1.0;

const increment = UVScroll.increment[eid];
map.offset.x = increment[0] ? offset[0] - (offset[0] % increment[0]) : offset[0];
map.offset.y = increment[1] ? offset[1] - (offset[1] % increment[1]) : offset[1];
});
}
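The offset math at the end of uvScrollSystem can be isolated into a small pure function: the raw offset accumulates continuously, but when `increment` is non-zero the applied offset snaps to multiples of it, giving a stepped, sprite-sheet-style scroll. A standalone sketch of that snapping rule (the function name is mine, not the PR's):

```typescript
// Snap a continuously accumulated offset down to a multiple of `increment`;
// an increment of 0 means smooth scrolling, so the raw offset passes through.
function steppedOffset(raw: number, increment: number): number {
  return increment ? raw - (raw % increment) : raw;
}
```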
209 changes: 209 additions & 0 deletions src/bit-systems/video-texture.ts
@@ -0,0 +1,209 @@
import { defineQuery, enterQuery, entityExists, exitQuery, removeComponent } from "bitecs";
import {
Camera,
LinearFilter,
Material,
MeshStandardMaterial,
NearestFilter,
PerspectiveCamera,
RGBAFormat,
sRGBEncoding,
Texture,
WebGLRenderTarget
} from "three";
import { HubsWorld } from "../app";
import { MaterialTag, VideoTextureSource, VideoTextureTarget } from "../bit-components";
import { Layers } from "../camera-layers";
import { VIDEO_TEXTURE_TARGET_FLAGS } from "../inflators/video-texture-target";
import { EntityID } from "../utils/networking-types";
import { findNode } from "../utils/three-utils";

interface SourceData {
renderTarget: WebGLRenderTarget;
camera: EntityID;
lastUpdated: number;
needsUpdate: boolean;
}

interface TargetData {
originalMap: Texture | null;
originalEmissiveMap: Texture | null;
originalBeforeRender: typeof Material.prototype.onBeforeRender;
boundTo: EntityID;
}

function noop() {}

export function updateRenderTarget(world: HubsWorld, renderTarget: WebGLRenderTarget, camera: EntityID) {
Contributor:

This function is tricky. (Not new)

Contributor (Author):

Yeah I think it will become less tricky if we schedule it in the right spot relative to normal rendering. It might actually already be the case that the bone visibility stuff is no longer needed but I didn't want to revisit that now.

const sceneEl = AFRAME.scenes[0];
const renderer = AFRAME.scenes[0].renderer;

const tmpVRFlag = renderer.xr.enabled;
renderer.xr.enabled = false;

// TODO we are doing this because aframe uses this hook for tock.
// Namely to capture what camera was rendering. We don't actually use that in any of our tocks.
// Also tock can likely go away as a concept since we can just directly order things after render in raf if we want to.
const tmpOnAfterRender = sceneEl.object3D.onAfterRender;
sceneEl.object3D.onAfterRender = noop;

const bubbleSystem = AFRAME.scenes[0].systems["personal-space-bubble"];
const boneVisibilitySystem = AFRAME.scenes[0].systems["hubs-systems"].boneVisibilitySystem;

if (bubbleSystem) {
for (let i = 0, l = bubbleSystem.invaders.length; i < l; i++) {
bubbleSystem.invaders[i].disable();
}
// HACK, bone visibility typically takes a tick to update, but since we want to be able
// to have enable() and disable() be reflected this frame, we need to do it immediately.
boneVisibilitySystem.tick();
// scene.autoUpdate will be false so explicitly update the world matrices
boneVisibilitySystem.updateMatrices();
}

const tmpRenderTarget = renderer.getRenderTarget();
renderer.setRenderTarget(renderTarget);
renderer.clearDepth();
renderer.render(sceneEl.object3D, world.eid2obj.get(camera)! as Camera);
renderer.setRenderTarget(tmpRenderTarget);

renderer.xr.enabled = tmpVRFlag;
sceneEl.object3D.onAfterRender = tmpOnAfterRender;

if (bubbleSystem) {
for (let i = 0, l = bubbleSystem.invaders.length; i < l; i++) {
bubbleSystem.invaders[i].enable();
}
// HACK, bone visibility typically takes a tick to update, but since we want to be able
// to have enable() and disable() be reflected this frame, we need to do it immediately.
boneVisibilitySystem.tick();
boneVisibilitySystem.updateMatrices();
}
}

function bindMaterial(world: HubsWorld, eid: EntityID) {
const srcData = sourceDataMap.get(VideoTextureTarget.source[eid]);
if (!srcData) {
console.error("video-texture-target unable to find source");
VideoTextureTarget.source[eid] = 0;
return;
}

const mat = world.eid2mat.get(eid)! as MeshStandardMaterial;
const targetData: TargetData = {
originalMap: mat.map,
originalEmissiveMap: mat.emissiveMap,
originalBeforeRender: mat.onBeforeRender,
boundTo: VideoTextureTarget.source[eid]
};
mat.onBeforeRender = function () {
// Only update when a target was in view last frame
// This is safe because this system always runs before render and invalid sources are unbound
sourceDataMap.get(VideoTextureTarget.source[eid])!.needsUpdate = true;
};
if (VideoTextureTarget.flags[eid] & VIDEO_TEXTURE_TARGET_FLAGS.TARGET_BASE_MAP) {
mat.map = srcData.renderTarget.texture;
}
if (VideoTextureTarget.flags[eid] & VIDEO_TEXTURE_TARGET_FLAGS.TARGET_EMISSIVE_MAP) {
mat.emissiveMap = srcData.renderTarget.texture;
}
targetDataMap.set(eid, targetData);
}

function unbindMaterial(world: HubsWorld, eid: EntityID) {
const targetData = targetDataMap.get(eid)!;
const mat = world.eid2mat.get(eid)! as MeshStandardMaterial;
if (VideoTextureTarget.flags[eid] & VIDEO_TEXTURE_TARGET_FLAGS.TARGET_BASE_MAP) {
mat.map = targetData.originalMap;
}
if (VideoTextureTarget.flags[eid] & VIDEO_TEXTURE_TARGET_FLAGS.TARGET_EMISSIVE_MAP) {
mat.emissiveMap = targetData.originalEmissiveMap;
}
mat.onBeforeRender = targetData.originalBeforeRender;
targetDataMap.delete(eid);
}

const sourceDataMap = new Map<EntityID, SourceData>();
const targetDataMap = new Map<EntityID, TargetData>();

const videoTextureSourceQuery = defineQuery([VideoTextureSource]);
const enteredVideoTextureSourcesQuery = enterQuery(videoTextureSourceQuery);
const exitedVideoTextureSourcesQuery = exitQuery(videoTextureSourceQuery);

const videoTextureTargetQuery = defineQuery([VideoTextureTarget, MaterialTag]);
const exitedVideoTextureTargetsQuery = exitQuery(videoTextureTargetQuery);
export function videoTextureSystem(world: HubsWorld) {
enteredVideoTextureSourcesQuery(world).forEach(function (eid) {
let camera = world.eid2obj.get(eid)! as PerspectiveCamera;
if (!(camera && camera.isCamera)) {
const actualCamera = findNode(camera, (o: any) => o.isCamera);
if (actualCamera) {
console.warn("video-texture-source should be added directly to a camera, not its ancestor.");
camera = actualCamera as PerspectiveCamera;
} else {
console.error("video-texture-source added to an entity without a camera");
removeComponent(world, VideoTextureSource, eid);
return;
}
}

camera.layers.enable(Layers.CAMERA_LAYER_THIRD_PERSON_ONLY);

const resolution = VideoTextureSource.resolution[eid];
camera.aspect = resolution[0] / resolution[1];

// TODO currently if a video-texture-source tries to render itself it will fail with a warning.
// If we want to support this we will need 2 render targets to swap back and forth.
const renderTarget = new WebGLRenderTarget(resolution[0], resolution[1], {
format: RGBAFormat,
minFilter: LinearFilter,
magFilter: NearestFilter,
encoding: sRGBEncoding
});

// Since we are rendering directly to a texture we need to flip it vertically
// See https://github.com/mozilla/hubs/pull/4126#discussion_r610120237
renderTarget.texture.matrixAutoUpdate = false;
renderTarget.texture.matrix.scale(1, -1);
renderTarget.texture.matrix.translate(0, 1);

sourceDataMap.set(eid, { renderTarget, lastUpdated: 0, camera: camera.eid!, needsUpdate: false });
});
exitedVideoTextureSourcesQuery(world).forEach(function (eid) {
const srcData = sourceDataMap.get(eid);
if (srcData) {
srcData.renderTarget.dispose();
sourceDataMap.delete(eid);
}
});

exitedVideoTextureTargetsQuery(world).forEach(function (eid) {
const isBound = targetDataMap.has(eid);
if (isBound && entityExists(world, eid)) unbindMaterial(world, eid);
targetDataMap.delete(eid);
});
videoTextureTargetQuery(world).forEach(function (eid) {
const source = VideoTextureTarget.source[eid];
const isBound = targetDataMap.has(eid);
if (isBound) {
if (!source || !entityExists(world, source)) {
unbindMaterial(world, eid);
VideoTextureTarget.source[eid] = 0;
} else if (source !== targetDataMap.get(eid)!.boundTo) {
unbindMaterial(world, eid);
bindMaterial(world, eid);
}
} else if (source && entityExists(world, source)) {
bindMaterial(world, eid);
}
});
Contributor:

What happens if the source entity goes away, and this VideoTextureTarget.source[eid] pointed to it? Do we need to check for that and set VideoTextureTarget.source[eid] = 0, is this handled already, or is this not possible / unnecessary?

Contributor (Author):

This should be handled just above by the `if (!source || !entityExists(world, source))` check.


videoTextureSourceQuery(world).forEach(function (eid) {
const sourceData = sourceDataMap.get(eid)!;
if (sourceData.needsUpdate && world.time.elapsed > sourceData.lastUpdated + 1000 / VideoTextureSource.fps[eid]) {
updateRenderTarget(world, sourceData.renderTarget, sourceData.camera);
sourceData.lastUpdated = world.time.elapsed;
sourceData.needsUpdate = false;
}
});
}
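The vertical flip applied to the render target's texture matrix above (scale by (1, -1), then translate by (0, 1)) composes to the mapping (u, v) → (u, 1 - v). A minimal sketch of that composed transform, written without three.js so the arithmetic is explicit:

```typescript
// Apply the composed flip matrix to a UV coordinate:
// scale (1, -1) followed by translate (0, 1) leaves u alone and mirrors v.
function flipV(u: number, v: number): [number, number] {
  return [u, v * -1 + 1];
}
```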