
Relay other people's tracking data

fire opened this issue 4 years ago • 5 comments

Pull other people's tracking data in alongside your own (display 2+ models at once).

fire avatar Jun 12 '21 22:06 fire

This should be doable by (see the sketch further down):

  1. Splitting the UDP receiver out of utils/OpenSeeFace.gd
  2. Rebroadcasting the corrected localhost data from OpenSeeFace (the face tracker)
  3. Possibly adding some sort of signaling server to facilitate a p2p connection

Once this is completed, it should also be possible to run the face tracker on a different computer on the local network. Since the UDP receiver would be decoupled, we could also swap out face-tracker runtimes, which might make it possible to use face-tracking data from an iPhone or Android device.
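
A minimal sketch of what the decoupled receiver could look like, assuming Godot 3.x and OpenSeeFace's default output port; the script name, relay address, and relay port below are placeholders, not anything that exists in the repo today:

# udp_relay.gd - hypothetical decoupled OpenSeeFace UDP receiver that also
# rebroadcasts the raw packets to another address
extends Node

const LISTEN_PORT := 11573          # OpenSeeFace's default output port
const RELAY_ADDRESS := "127.0.0.1"  # placeholder: where the data gets rebroadcast
const RELAY_PORT := 11574           # placeholder

signal packet_received(packet)

var _receiver := PacketPeerUDP.new()
var _sender := PacketPeerUDP.new()

func _ready() -> void:
	_receiver.listen(LISTEN_PORT)
	_sender.set_dest_address(RELAY_ADDRESS, RELAY_PORT)

func _process(_delta: float) -> void:
	while _receiver.get_available_packet_count() > 0:
		var packet := _receiver.get_packet()
		emit_signal("packet_received", packet)  # local consumers (model drivers)
		_sender.put_packet(packet)              # rebroadcast unchanged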

Not necessary for a 1.0.0 release but nice to have.

you-win avatar Jun 13 '21 01:06 you-win

There's already a standard format using OSC. Not sure of the details.

  • https://github.com/infosia/vmc2bvh
  • Although BVH itself carries no blend shapes, we do have that data.

fire avatar Jun 13 '21 02:06 fire

Is the suggestion to make BVH the standard data format?

Sending data from OpenSeeFace -> OpenSeeFace is easy enough: just repack the data into an equivalent buffer (rough sketch below). I think once that's handled and support for alternative face trackers is possible, I can start looking into an intermediate data layer.
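
As a rough illustration of the repacking idea (the field names and offsets here are placeholders, not the actual OpenSeeFace packet layout; the real parsing lives in utils/OpenSeeFace.gd):

func repack(packet: PoolByteArray) -> PoolByteArray:
	# Hypothetical example: read a couple of leading fields, then copy the rest through.
	var reader := StreamPeerBuffer.new()
	reader.data_array = packet
	var timestamp := reader.get_double()  # placeholder field
	var face_id := reader.get_32()        # placeholder field

	var writer := StreamPeerBuffer.new()
	writer.put_double(timestamp)
	writer.put_32(face_id)
	writer.put_data(packet.subarray(12, packet.size() - 1))  # pass the rest through unchanged
	return writer.data_array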

you-win avatar Jun 13 '21 03:06 you-win

I mean VMC.

I don't speak Japanese that well. https://protocol.vmc.info/specification

One approach is to reuse the vmc2bvh module and connect it directly to the Godot engine: VMC to Godot.

See https://apps.apple.com/us/app/waidayo/id1513166077
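
For reference, the VMC messages most relevant here look roughly like the following (addresses taken from the spec above; the argument lists and the default port 39539 should be double-checked against it):

# VMC is OSC over UDP; performer apps such as waidayo send to port 39539 by default.
const VMC_ADDRESSES := [
	"/VMC/Ext/Root/Pos",    # "root", px, py, pz, qx, qy, qz, qw
	"/VMC/Ext/Bone/Pos",    # bone name, px, py, pz, qx, qy, qz, qw
	"/VMC/Ext/Blend/Val",   # blend shape name, value
	"/VMC/Ext/Blend/Apply", # commit the queued blend shape values
	"/VMC/Ext/OK",          # sender status
]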

fire avatar Jun 13 '21 03:06 fire

Basic approach

// Header sketch: a PoseCapture class whose API mirrors Godot's
// AudioEffectCapture, but buffers pose/tracking frames instead of audio.
// (The base class and the PoseCaptureInstance type are assumptions.)
class PoseCaptureInstance;

class PoseCapture : public Resource {
	GDCLASS(PoseCapture, Resource);

protected:
	static void _bind_methods();

public:
	// Per-use instance, analogous to AudioEffect::instance().
	virtual Ref<PoseCaptureInstance> instance();

	// Length of the internal ring buffer, in seconds.
	void set_buffer_length(float p_buffer_length_seconds);
	float get_buffer_length() const;

	// Consume buffered frames as Animation data.
	bool can_get_buffer(int p_frames) const;
	Vector<Ref<Animation>> get_buffer(int p_len);
	void clear_buffer();

	// Buffer statistics.
	int get_frames_available() const;
	int64_t get_discarded_frames() const;
	int get_buffer_length_frames() const;
	int64_t get_pushed_frames() const;
};

See https://github.com/godotengine/godot/blob/6f7d45d2109246e3888fc2b16136915e6fec89fd/scene/resources/animation.h#L67 for the different types of tracks.

We key on the NodePath, the key value, and the time (absolute time).
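
A minimal sketch of keying that way with the Godot 3.x Animation API (the skeleton path and values are placeholders):

var anim := Animation.new()
var track := anim.add_track(Animation.TYPE_TRANSFORM)
anim.track_set_path(track, NodePath("Skeleton:head"))
# transform_track_insert_key(track_idx, absolute_time_seconds, location, rotation, scale)
anim.transform_track_insert_key(track, 0.033, Vector3.ZERO, Quat(), Vector3.ONE)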

<Lyuma> If you're on 802.11ac, you can assume latency of no more than 5-10 ms, but some older Wi-Fi specs can have substantially higher jitter: 50 ms+ is possible.

fire avatar Jun 13 '21 04:06 fire