How do I send an audio file recorded with react-native-audio-record to a server?

Asked: 2020-08-13 14:46:15

Tags: laravel react-native file-upload react-native-android file-get-contents

I need to record audio and upload it to a server. For recording I am using the "react-native-audio-record" React Native package.

When I call file_get_contents($request->file('inputFile')) in Laravel, file_get_contents keeps returning a 500 internal server error.

I have tried both FormData and Blob objects.

Here is my React Native code, along with everything I have tried to solve this issue:

onStartRecord = async () => {
    this.setState({ isPlaying: false });

    if (Platform.OS === 'android') {
      // Request storage permission.
      try {
        const granted = await PermissionsAndroid.request(
          PermissionsAndroid.PERMISSIONS.WRITE_EXTERNAL_STORAGE,
          {
            title: 'Permissions for write access',
            message: 'Give permission to your storage to write a file',
            buttonPositive: 'ok',
          },
        );
        if (granted !== PermissionsAndroid.RESULTS.GRANTED) {
          console.log('storage permission denied');
          return;
        }
      } catch (err) {
        console.warn(err);
        return;
      }

      // Request microphone permission.
      try {
        const granted = await PermissionsAndroid.request(
          PermissionsAndroid.PERMISSIONS.RECORD_AUDIO,
          {
            title: 'Permission to record audio',
            message: 'Give permission to use the microphone',
            buttonPositive: 'ok',
          },
        );
        if (granted !== PermissionsAndroid.RESULTS.GRANTED) {
          console.log('microphone permission denied');
          return;
        }
      } catch (err) {
        console.warn(err);
        return;
      }
    }

    const path = Platform.select({
      ios: 'hello.m4a',
      android: 'sdcard/hello.mp3',
    });
    const audioSet: AudioSet = {};

    const uri = await this.audioRecorderPlayer.startRecorder(path);
    console.log('URI => ', uri);

    this.audioRecorderPlayer.addRecordBackListener((e: any) => {
      this.setState({
        recordSecs: e.current_position,
        recordTime: this.audioRecorderPlayer.mmssss(
          Math.floor(e.current_position),
        ),
      });
    });

    // Attempt 1: FormData with a { name, type, uri } file descriptor.
    var body = new FormData();
    body.append('inputFile', {
      name: 'sound.mp4',
      type: 'audio/mp3',
      uri: uri,
    });
    console.log('body', body);

    // Attempt 2: fetch the recording URI and append the resulting Blob.
    // (I also tried reading the file as base64 with RNFS / RNFetchBlob.)
    const blob = await (await fetch(uri)).blob();
    var bodyData = new FormData();
    bodyData.append('inputFile', { blob });

    this.props.setLoader(true);
    this.props.uploadAudio(bodyData).then(result => {
      console.log('audioRecordingResponse', this.props.audioRecordingResponse);
      if (this.props.audioRecordingResponse.success) {
        this.handler('success', 'Success', this.props.audioRecordingResponse.message);
      } else {
        this.props.setLoader(false);
        this.handler('error', 'Error', this.props.audioRecordingResponse.message);
      }
    });
  };
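For reference, a minimal sketch of building a multipart body with a real file part, using the standard FormData/Blob APIs (available in browsers and in Node 18+; in React Native itself you would append a { uri, name, type } descriptor instead). The field name inputFile and the placeholder bytes are assumptions for illustration:

```javascript
// Minimal sketch: append binary data to FormData as a named file part.
// The three placeholder bytes stand in for real recorded audio.
const bytes = new Uint8Array([0x49, 0x44, 0x33]);
const form = new FormData();

// Appending a Blob plus a filename produces a proper file part;
// appending a plain wrapper object like { blob } gets serialized as a string,
// so the server never receives an actual file.
form.append('inputFile', new Blob([bytes], { type: 'audio/mpeg' }), 'sound.mp3');

const part = form.get('inputFile');
console.log(part.name, part.size); // sound.mp3 3
```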

Please let me know if anyone has a solution for this.

1 Answer:

Answer 0 (score: 0):

I am not sure this answers your exact situation, but here is how I send audio from my React Native app:

import AudioRecord from 'react-native-audio-record';
import * as RNFS from 'react-native-fs';
.....
record = () => {
    if (!this.state.recording) {
        this.setState({ recording: true }, () => {
            AudioRecord.start();
        });
    } else {
        AudioRecord.stop().then(r => {
            this.setState({ recording: false });

            RNFS.readFile(r, 'base64') // r is the path to the .wav file on the phone
                .then((data) => {
                    this.context.socket.emit('sendingAudio', {
                        sound: data,
                    });
                });
        });
    }
};

I use sockets in my implementation, but you could use almost anything, since all I am sending is one long string. Then, on the server side, I decode the string like this:

import fs from 'fs';

export async function sendingAudio(data) {
  let fileName = `sound.wav`;
  let buff = Buffer.from(data.sound, 'base64');
  fs.writeFileSync(fileName, buff); // synchronous write, so no await is needed
}

So, basically, I create a wav file on the phone, read it into a base64-encoded string, send that string to the server, and on the server decode it from base64 back into a .wav file.

For Laravel, I believe this can help you: Decode base64 audio. Just don't save it as mp3, save it as wav.