Support for element type: 'video' | 'canvas' and control of that element. #355

Open

jewel998 opened this issue Dec 15, 2024 · 8 comments

@jewel998
Describe the solution you'd like
If we could get control of the video or canvas element while it is being shown, it would enable several features such as AR and VR rendering, effects, and filters.

Describe alternatives you've considered
Exposing the underlying MediaStream would also be a big help for development: we could then pass it through TensorFlow.js models in the browser and build further features on top.
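For a concrete sense of what this unlocks, here is a minimal sketch for the web side only: the plugin's web implementation injects a video element into the given parent container, so both the element and its MediaStream are already reachable from the page. The parent id 'camera-preview' and the grayscale filter are placeholder choices, not anything the plugin mandates.

import { CameraPreview } from '@capacitor-community/camera-preview';

async function previewWithFilter(canvas: HTMLCanvasElement) {
  await CameraPreview.start({ parent: 'camera-preview', toBack: true });

  // On web, the plugin renders a <video> element inside the parent container.
  const video = document
    .getElementById('camera-preview')
    ?.querySelector('video');
  const ctx = canvas.getContext('2d');
  if (!video || !ctx) return;

  // The raw MediaStream, if other consumers (recording, WebRTC, ML) need it.
  const stream = video.srcObject as MediaStream | null;
  console.log('video tracks:', stream?.getVideoTracks().length);

  const draw = () => {
    ctx.filter = 'grayscale(1)'; // any CSS filter works as a stand-in "effect"
    ctx.drawImage(video, 0, 0, canvas.width, canvas.height);
    requestAnimationFrame(draw);
  };
  requestAnimationFrame(draw);
}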

@Saqib92

Saqib92 commented Jan 14, 2025

@jewel998
Author

jewel998 commented Jan 15, 2025

This is not supported for Capacitor 6. The plugin is quite old, and no changes have been made in three years.

@jewel998
Author

@Saqib92 thanks, I'll put together a new Capacitor camera plugin that reuses this code and brings it up to date with current Capacitor.

@jewel998
Author

@Saqib92 just FYI, the problem with this is the latency caused by high-quality capture: there seem to be a lot of frame delays and drops, which makes it infeasible to reproduce the preview smoothly. I've tried it for Android in this plugin, but the delay is far too high for me to proceed in this direction.
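For what it's worth, the pile-up can be softened a bit by capping the captureSample rate and keeping at most one capture in flight, so slow frames get skipped instead of queued. A rough sketch; the ~15 fps target and quality value are assumptions, not tuned numbers:

import { CameraPreview } from '@capacitor-community/camera-preview';

const FRAME_INTERVAL_MS = 66; // ~15 fps target
let busy = false;

function startCaptureLoop(onFrame: (base64Jpeg: string) => void) {
  setInterval(async () => {
    if (busy) return; // previous capture still running: skip this tick
    busy = true;
    try {
      const { value } = await CameraPreview.captureSample({ quality: 30 });
      if (value) onFrame(value);
    } finally {
      busy = false;
    }
  }, FRAME_INTERVAL_MS);
}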

@jewel998 jewel998 reopened this Jan 16, 2025
@Saqib92

Saqib92 commented Jan 16, 2025

> @Saqib92 just FYI, the problem with this is the latency caused by high-quality capture: there seem to be a lot of frame delays and drops, which makes it infeasible to reproduce the preview smoothly. I've tried it for Android in this plugin, but the delay is far too high for me to proceed in this direction.

Yes, the plugin is old. I used it in my app back in 2022 and it worked; I combined it with TensorFlow.js to show a skeleton on the user's body. Now I am also looking for updates, as I have no experience in native development. Hope you can manage to make it work. Good luck.
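For anyone trying to reproduce that setup today, the TensorFlow.js half looks roughly like this: estimate poses from the preview's video element and draw the keypoints onto an overlay canvas. MoveNet and the 0.3 score threshold are arbitrary sketch choices, not necessarily what the 2022 app used:

import * as tf from '@tensorflow/tfjs-core';
import '@tensorflow/tfjs-converter';
import '@tensorflow/tfjs-backend-webgl';
import * as poseDetection from '@tensorflow-models/pose-detection';

async function drawSkeleton(video: HTMLVideoElement, canvas: HTMLCanvasElement) {
  await tf.setBackend('webgl');
  await tf.ready();
  const detector = await poseDetection.createDetector(
    poseDetection.SupportedModels.MoveNet,
  );
  const ctx = canvas.getContext('2d')!;

  const loop = async () => {
    const [pose] = await detector.estimatePoses(video);
    ctx.clearRect(0, 0, canvas.width, canvas.height);
    ctx.fillStyle = 'lime';
    for (const kp of pose?.keypoints ?? []) {
      // Draw a dot for each sufficiently confident keypoint.
      if ((kp.score ?? 0) > 0.3) {
        ctx.beginPath();
        ctx.arc(kp.x, kp.y, 4, 0, 2 * Math.PI);
        ctx.fill();
      }
    }
    requestAnimationFrame(loop);
  };
  requestAnimationFrame(loop);
}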

@jewel998
Author

I'm also looking at the same kind of use case.
The probable solution I have in mind is to integrate an AI model that detects facial or positional landmarks, somewhat like Google Sceneform does, with an API to manipulate the GLSurfaceView so glTF or OBJ models can be rendered.
I previously integrated Google MediaPipe, but it didn't help much with rendering glTF or OBJ models easily.
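For the landmark-detection half, the current MediaPipe Tasks web API makes that part fairly direct. A sketch (the CDN and model URLs below are the ones Google publishes, but verify them; rendering glTF on top, e.g. with three.js, is left out entirely):

import { FaceLandmarker, FilesetResolver } from '@mediapipe/tasks-vision';

async function trackFaceLandmarks(video: HTMLVideoElement) {
  const fileset = await FilesetResolver.forVisionTasks(
    'https://cdn.jsdelivr.net/npm/@mediapipe/tasks-vision@latest/wasm',
  );
  const landmarker = await FaceLandmarker.createFromOptions(fileset, {
    baseOptions: {
      modelAssetPath:
        'https://storage.googleapis.com/mediapipe-models/face_landmarker/face_landmarker/float16/1/face_landmarker.task',
    },
    runningMode: 'VIDEO',
  });

  const loop = () => {
    const result = landmarker.detectForVideo(video, performance.now());
    // result.faceLandmarks holds one array of 478 normalized points per face;
    // these would drive the pose of a glTF model in a separate renderer.
    console.log(result.faceLandmarks[0]?.length);
    requestAnimationFrame(loop);
  };
  requestAnimationFrame(loop);
}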

@bokzor

bokzor commented Jan 25, 2025

  • AR Component: Overlays images on camera preview
  • Features:
    • Real-time camera preview with image overlay
    • Camera switching (front/rear)
    • Photo capture
    • Cross-platform support (web/native)
  • Uses the Capacitor Camera Preview plugin for native integration and a canvas for rendering
import {
  AfterViewInit,
  ChangeDetectionStrategy,
  Component,
  ElementRef,
  input,
  OnDestroy,
  output,
  viewChild,
  effect,
  signal,
} from '@angular/core';
import { NgIf } from '@angular/common';
import { CameraPreview } from '@capacitor-community/camera-preview';
import { SoundTypeEnum } from '../../models/Sound';
import { PlaysSoundOnClickDirective } from '../../directives/plays-sound-on-click.directive';
import { Capacitor } from '@capacitor/core';

export interface OverlayOptions {
  image: string | undefined;
  top: number;
  height: number;
  left: number;
  width: number;
}

export type FacingMode = 'user' | 'environment';

@Component({
  selector: 'app-ar',
  imports: [PlaysSoundOnClickDirective, NgIf],
  templateUrl: './ar.component.html',
  styleUrl: './ar.component.scss',
  changeDetection: ChangeDetectionStrategy.OnPush,
})
export class ArComponent implements OnDestroy, AfterViewInit {
  readonly overlayOptions = input<OverlayOptions | null>(null);
  readonly cameraOrientation = input<FacingMode>('environment');
  readonly mainCanvas = viewChild<ElementRef<HTMLCanvasElement>>('mainCanvas');
  readonly imageTakenChange = output<string | null>();

  // Convert to signals for UI bindings
  readonly currentOrientation = signal<FacingMode>('environment');
  readonly animationFrameId = signal<number | null>(null);
  readonly imageTaken = signal<string | null>(null);
  readonly showLoader = signal<boolean>(true);
  readonly isPreviewActive = signal<boolean>(false);

  protected readonly SoundTypeEnum = SoundTypeEnum;
  private pngImage = new Image();

  // Build the options at call time: signal inputs are not bound yet when
  // field initializers run, so a plain field would always see the default
  // orientation rather than the bound one.
  private get cameraPreviewOpts() {
    return {
      position: this.cameraOrientation() === 'environment' ? 'rear' : 'front',
      parent: 'camera-preview',
      className: 'camera-preview',
      lockAndroidOrientation: true,
      width: window.innerWidth,
      height: window.innerWidth,
      x: 0,
      y: 0,
      toBack: true,
      paddingBottom: 0,
      rotateWhenOrientationChanged: true,
      storeToFile: false,
      disableExifHeaderStripping: false,
    };
  }

  constructor() {
    effect(() => {
      const options = this.overlayOptions();
      if (options) {
        this.preloadAnnotationImage();
      }
    });
  }

  async ngAfterViewInit(): Promise<void> {
    this.currentOrientation.set(this.cameraOrientation());

    // Set canvas dimensions
    const mainCanvas = this.mainCanvas();
    if (mainCanvas) {
      mainCanvas.nativeElement.width = window.innerWidth;
      mainCanvas.nativeElement.height = window.innerWidth;
    }

    await this.startCamera();
    this.startRenderingLoop();
  }

  async startCamera() {
    try {
      // Start the preview first
      await CameraPreview.start(this.cameraPreviewOpts);
      this.isPreviewActive.set(true);

      // Get the preview element
      const previewElement = document.getElementById('camera-preview');
      if (previewElement) {
        // Move it off-screen but keep it visible
        previewElement.style.position = 'absolute';
        previewElement.style.top = '-9999px';
        previewElement.style.opacity = '1';
        previewElement.style.display = 'block';
      }

      this.showLoader.set(false);
    } catch (err) {
      console.error('Error starting camera preview:', err);
      this.showLoader.set(false);
    }
  }

  async startRenderingLoop() {
    const isNative = Capacitor.isNativePlatform();

    const drawFrame = async () => {
      if (!this.isPreviewActive()) return;

      const mainCanvas = this.mainCanvas();
      if (!mainCanvas) return;

      const ctx = mainCanvas.nativeElement.getContext('2d');
      if (!ctx) return;

      try {
        if (isNative) {
          // Native platforms: use captureSample
          const result = await CameraPreview.captureSample({
            quality: 50,
          });

          if (result.value) {
            const img = new Image();
            img.src = `data:image/jpeg;base64,${result.value}`;

            await new Promise<void>((resolve) => {
              img.onload = () => {
                this.drawToCanvas(ctx, img, mainCanvas.nativeElement);
                resolve();
              };
              // Resolve on error too, so one bad frame can't stall the loop.
              img.onerror = () => resolve();
            });
          }
        } else {
          // Web: use video element
          const previewElement = document.getElementById('camera-preview');
          if (previewElement) {
            const videoElement = previewElement.querySelector('video');
            if (videoElement) {
              this.drawToCanvas(ctx, videoElement, mainCanvas.nativeElement);
            }
          }
        }

        // Request the next frame
        const frameId = requestAnimationFrame(() =>
          this.drawNextFrame(drawFrame),
        );
        this.animationFrameId.set(frameId);
      } catch (error) {
        console.error('Error in rendering loop:', error);
        const frameId = requestAnimationFrame(() =>
          this.drawNextFrame(drawFrame),
        );
        this.animationFrameId.set(frameId);
      }
    };

    // Start the rendering loop
    if (isNative) {
      await drawFrame();
    } else {
      drawFrame();
    }
  }

  private drawToCanvas(
    ctx: CanvasRenderingContext2D,
    source: HTMLImageElement | HTMLVideoElement,
    canvas: HTMLCanvasElement,
  ) {
    // Clear the canvas
    ctx.clearRect(0, 0, canvas.width, canvas.height);

    // Draw the camera frame with mirroring if needed
    if (this.currentOrientation() === 'user') {
      ctx.save();
      ctx.scale(-1, 1);
      ctx.drawImage(source, -canvas.width, 0, canvas.width, canvas.height);
      ctx.restore();
    } else {
      ctx.drawImage(source, 0, 0, canvas.width, canvas.height);
    }

    // Draw the annotation if available
    const annotation = this.overlayOptions();
    if (annotation) {
      const newScaleFactor = canvas.width / 600;
      const newPngDetails = {
        top: annotation.top * newScaleFactor,
        left:
          this.currentOrientation() === 'user'
            ? canvas.width -
              annotation.left * newScaleFactor -
              annotation.width * newScaleFactor
            : annotation.left * newScaleFactor,
        width: annotation.width * newScaleFactor,
        height: annotation.height * newScaleFactor,
      };

      ctx.drawImage(
        this.pngImage,
        newPngDetails.left,
        newPngDetails.top,
        newPngDetails.width,
        newPngDetails.height,
      );
    }
  }

  private async drawNextFrame(drawFrame: () => Promise<void>) {
    try {
      await drawFrame();
    } catch (error) {
      console.error('Error in draw frame:', error);
      const frameId = requestAnimationFrame(() =>
        this.drawNextFrame(drawFrame),
      );
      this.animationFrameId.set(frameId);
    }
  }

  async takePicture() {
    const flashElement = document.querySelector('.flash');
    flashElement?.classList.add('shutterClick');

    try {
      // Use the current canvas content as the photo
      const mainCanvas = this.mainCanvas();
      if (mainCanvas) {
        const imageData = mainCanvas.nativeElement.toDataURL('image/jpeg', 0.9);
        this.imageTaken.set(imageData);
        this.imageTakenChange.emit(imageData);
      }
    } catch (error) {
      console.error('Error capturing image:', error);
    } finally {
      setTimeout(() => {
        flashElement?.classList.remove('shutterClick');
      }, 600);
    }
  }

  clearImage() {
    this.imageTaken.set(null);
    this.imageTakenChange.emit(null);
  }

  async switchCamera() {
    const newOrientation =
      this.currentOrientation() === 'user' ? 'environment' : 'user';
    this.currentOrientation.set(newOrientation);
    await CameraPreview.flip();
  }

  async ngOnDestroy() {
    const animationFrameId = this.animationFrameId();
    if (animationFrameId) {
      cancelAnimationFrame(animationFrameId);
    }
    if (this.isPreviewActive()) {
      await CameraPreview.stop();
    }
  }

  async preloadAnnotationImage() {
    return new Promise<void>((resolve) => {
      const overlayOptions = this.overlayOptions();
      if (overlayOptions?.image) {
        console.log('Loading annotation image');
        // Set crossOrigin and the handlers before assigning src, so the
        // CORS request mode applies from the start and a cached image
        // can't fire onload before the handler is attached.
        this.pngImage.crossOrigin = 'anonymous';
        this.pngImage.onload = () => resolve();
        this.pngImage.onerror = () => {
          console.error('Error loading annotation image');
          resolve();
        };
        this.pngImage.src = overlayOptions.image;
      } else {
        resolve();
      }
    });
  }
}

@jewel998
Author

@bokzor There is heavy frame drop because captureSample isn't threaded, and since the samples come back unordered and without timestamps, it's hard to track which promise corresponds to which frame when rendering.
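A client-side workaround for the ordering half of the problem is to stamp each captureSample call with a local sequence number and drop any frame that resolves after a newer one has already been drawn. A sketch; it does nothing for the latency itself, only for out-of-order rendering:

import { CameraPreview } from '@capacitor-community/camera-preview';

let requestSeq = 0;
let lastDrawnSeq = -1;

async function captureInOrder(draw: (base64Jpeg: string) => void) {
  const seq = requestSeq++; // stamp the request before awaiting
  const { value } = await CameraPreview.captureSample({ quality: 50 });
  if (value && seq > lastDrawnSeq) {
    lastDrawnSeq = seq; // newer frame wins; late stragglers get dropped
    draw(value);
  }
}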
