How can I create a waveform from an audio component using TypeScript?

Time: 2019-02-08 14:15:09

Tags: reactjs typescript audio

I can't figure out how to get a component working with the Web Audio API using React and TypeScript.

The example I want to follow is https://github.com/philnash/react-web-audio, which is written in JavaScript. I rewrote it to fit my React/TypeScript environment. The problem is that the author uses a peripheral input device (the microphone) to obtain the audio feed, whereas I'm using an audio component with a URL as its source, which gives me an HTMLAudioElement. I don't know how to get from the HTMLAudioElement to the MediaStream I need. I tried srcObject, but it is always null.
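For completeness, here is one route I've been experimenting with (my own assumption, not something the original repo does): the experimental HTMLMediaElement.captureStream() method, which Firefox historically exposes only as mozCaptureStream(). Since neither is in the TypeScript DOM typings, I modelled the element with a minimal interface:

```typescript
// Assumption under test: captureStream() / mozCaptureStream() hand back a
// MediaStream for a playing media element. Both are experimental and absent
// from the TypeScript DOM lib, hence this minimal structural interface.
interface AudioElementLike {
  captureStream?: () => unknown;
  mozCaptureStream?: () => unknown;
}

// Try the standard method first, then the Firefox-prefixed one.
function streamFromElement(el: AudioElementLike): unknown {
  if (typeof el.captureStream === 'function') {
    return el.captureStream();
  }
  if (typeof el.mozCaptureStream === 'function') {
    return el.mozCaptureStream();
  }
  return null; // neither variant supported
}
```

In MyAudio this would be called from the audioRef callback once the element is available; I have not confirmed how widely it is supported, which is part of why I'm asking.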

File MyAudio.tsx

import * as React from 'react';
import {fromNullable, none, Option} from 'fp-ts/lib/Option';
import {AudioAnalyser} from './AudioAnalyser';

export type MyAudioProps = {
  onAudioError?: (e: any) => void;
};

type State = {
  audioElement: Option<HTMLAudioElement>;
};

export class MyAudio extends React.PureComponent<MyAudioProps, State> {
  private blobUrl = 'https://archive.org/download/AbdulAzizSurahFateha/'
    + '001Www.quranaudio.info.mp3';

  constructor(props: MyAudioProps) {
    super(props);
    this.state = {audioElement: none};
  }

  // Capture the <audio> element in state the first time the ref fires.
  private readonly audioRef = (el: HTMLAudioElement | null) => {
    this.state.audioElement.fold(
      () => {
        this.setState({audioElement: fromNullable(el)});
      },
      () => {} // already captured; nothing to do
    );
  }

  render() {
    const audioElement = this.state.audioElement.toUndefined();
    // This is always null - what do I have to do to get a stream?
    const srcObject = audioElement ? audioElement.srcObject : undefined;
    return (
      <div>
        <audio controls ref={this.audioRef} src={this.blobUrl}/>
        {
          srcObject && srcObject instanceof MediaStream
            ? <AudioAnalyser audio={srcObject}/>
            : null
        }
      </div>
    );
  }
}

File AudioAnalyser.tsx

import * as React from 'react';
import {AudioVisualiser} from './AudioVisualiser';

type Props = {
  audio: MediaStream;
};

type State = {
  audioData: Uint8Array;
};

export class AudioAnalyser extends React.PureComponent<Props, State> {
  // Assigned in componentDidMount, hence the definite-assignment assertions.
  audioContext!: AudioContext;
  analyser!: AnalyserNode;
  dataArray!: Uint8Array;
  source!: MediaStreamAudioSourceNode;
  rafId!: number;

  constructor(props: Props) {
    super(props);
    this.state = {audioData: new Uint8Array(0)};
    this.tick = this.tick.bind(this);
  }

  componentDidMount() {
    // webkitAudioContext is not in the TypeScript DOM typings, so fall
    // back through `any` for older Safari.
    this.audioContext = new (window.AudioContext
      || (window as any).webkitAudioContext)();
    this.analyser = this.audioContext.createAnalyser();
    this.dataArray = new Uint8Array(this.analyser.frequencyBinCount);
    this.source = this.audioContext.createMediaStreamSource(this.props.audio);
    this.source.connect(this.analyser);
    this.rafId = requestAnimationFrame(this.tick);
  }

  tick() {
    this.analyser.getByteTimeDomainData(this.dataArray);
    this.setState({audioData: this.dataArray});
    this.rafId = requestAnimationFrame(this.tick);
  }

  componentWillUnmount() {
    cancelAnimationFrame(this.rafId);
    this.analyser.disconnect();
    this.source.disconnect();
  }

  render() {
    return <AudioVisualiser audioData={this.state.audioData}/>;
  }
}

File AudioVisualiser.tsx

import * as React from 'react';
import {fromNullable, none, Option} from 'fp-ts/lib/Option';

export type Props = {
  audioData: Uint8Array;
};

export class AudioVisualiser extends React.PureComponent<Props> {
  canvas: Option<HTMLCanvasElement> = none;

  componentDidUpdate() {
    this.draw();
  }

  private readonly canvasRef = (el: HTMLCanvasElement | null) => {
    this.canvas = fromNullable(el);
  }

  draw() {
    const {audioData} = this.props;
    this.canvas.fold(
      () => {}, // no canvas yet
      canvas => {
        const height = canvas.height;
        const width = canvas.width;
        const context = canvas.getContext('2d');
        if (context) {
          let x = 0;
          const sliceWidth = (width * 1.0) / audioData.length;

          context.lineWidth = 2;
          context.strokeStyle = '#000000';
          context.clearRect(0, 0, width, height);

          context.beginPath();
          context.moveTo(0, height / 2);
          audioData.forEach(item => {
            const y = (item / 255.0) * height;
            context.lineTo(x, y);
            x += sliceWidth;
          });
          context.lineTo(x, height / 2);
          context.stroke();
        }
      }
    );
  }

  render() {
    return <canvas width='600' height='100' ref={this.canvasRef}/>;
  }
}
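As an aside, the sample-to-pixel mapping inside draw() can be pulled out into a pure function (my own refactor for testability, not part of the original repo); this at least confirms the geometry is right while the MediaStream question stays open:

```typescript
// Pure version of the coordinate mapping used in draw(): byte samples
// (0-255, with 128 meaning silence) are spread evenly across the canvas
// width and scaled to the canvas height.
function waveformPoints(
  audioData: Uint8Array,
  width: number,
  height: number
): Array<[number, number]> {
  const sliceWidth = width / audioData.length;
  let x = 0;
  const points: Array<[number, number]> = [];
  audioData.forEach(item => {
    points.push([x, (item / 255.0) * height]);
    x += sliceWidth;
  });
  return points;
}
```

draw() could then iterate over the returned points with lineTo, keeping the canvas calls separate from the arithmetic.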

When I read srcObject, it is always null. If I instead try to pass the audio ref itself, as I have seen in other examples, I get a TypeScript type error because it is an HTMLAudioElement, not a MediaStream. If I change the type to "any", I get a runtime error, because it is not a MediaStream.
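The only alternative I can think of (an untested assumption on my part) is to bypass MediaStream entirely with AudioContext.createMediaElementSource(), which accepts the HTMLAudioElement itself. Sketched here against minimal interfaces so the wiring is visible without a browser:

```typescript
// Minimal structural interfaces standing in for the Web Audio types, so
// this sketch type-checks outside a browser environment.
interface NodeLike {
  connect(destination: NodeLike): void;
}
interface ContextLike {
  destination: NodeLike;
  createMediaElementSource(el: unknown): NodeLike;
  createAnalyser(): NodeLike;
}

// element -> source -> analyser -> destination. The final connect matters:
// once a media element is routed through the graph, skipping the
// connection to `destination` would leave playback silent.
function wireElementToAnalyser(ctx: ContextLike, audioEl: unknown): NodeLike {
  const source = ctx.createMediaElementSource(audioEl);
  const analyser = ctx.createAnalyser();
  source.connect(analyser);
  analyser.connect(ctx.destination);
  return analyser;
}
```

If this is viable, AudioAnalyser would take the element rather than a MediaStream and call createMediaElementSource instead of createMediaStreamSource, but I'd like confirmation that this is the right direction.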

I don't know how to obtain a MediaStream for the actual audio.

0 Answers:

There are no answers yet.