AVAudioRecorder crashes after the view disappears

Asked: 2014-01-20 13:10:55

Tags: c# ios xamarin.ios crash avaudiorecorder

To record audio I present a new view as a subview, and after recording I remove that subview with RemoveFromSuperview. That is just to describe how I handle the overlay.

The problem is that after recording with AVAudioRecorder, the app sometimes crashes right after the overlay view is removed, and sometimes even later, when the superview disappears.

The crash log looks like this:

Incident Identifier: 65BE4068-D7BE-4F3F-B739-00006CAB74CC
CrashReporter Key:   d97865bbdce37e60edb2f77974b8b86911a1007e
Hardware Model:      iPad4,1
Process:             B2MobileBjiOS [8764]
Path:                /var/mobile/Applications/3C397A5D-B00B-40CE-BCF8-0C6836B0EBB2/B2MobileBjiOS.app/B2MobileBjiOS
Identifier:          ch.bauplus.mobile.bj.trunk
Version:             30 (1.0)
Code Type:           ARM (Native)
Parent Process:      launchd [1]

Date/Time:           2014-01-20 13:54:04.160 +0100
OS Version:          iOS 7.0.4 (11B554a)
Report Version:      104

Exception Type:  EXC_BAD_ACCESS (SIGABRT)
Exception Subtype: KERN_INVALID_ADDRESS at 0x0000000000000000
Triggered by Thread:  0

Thread 0 Crashed:
0   libsystem_kernel.dylib          0x39f8a1fc __pthread_kill + 8
1   libsystem_pthread.dylib         0x39ff1a4f pthread_kill + 55
2   libsystem_c.dylib               0x39f3b029 abort + 73
3   B2MobileBjiOS                   0x01607d95 mono_handle_native_sigsegv (mini-exceptions.c:2335)
4   B2MobileBjiOS                   0x01612a99 mono_sigsegv_signal_handler (mini.c:6744)
5   libsystem_platform.dylib        0x39fec721 _sigtramp + 41
6   libAVFAudio.dylib               0x2e2db0b3 -[AVAudioRecorder dealloc] + 119
7   libAVFAudio.dylib               0x2e2db0b3 -[AVAudioRecorder dealloc] + 119
8   libobjc.A.dylib                 0x399e5b07 objc_object::sidetable_release(bool) + 171
9   B2MobileBjiOS                   0x016bfb00 monotouch_release_managed_ref (monotouch-glue.m:1296)
10  B2MobileBjiOS                   0x0109629c wrapper_managed_to_native_MonoTouch_Foundation_NSObject_monotouch_release_managed_ref_intptr + 88
11  B2MobileBjiOS                   0x00cb31d8 MonoTouch_Foundation_NSObject_ReleaseManagedRef + 28
12  B2MobileBjiOS                   0x00cb559c MonoTouch_Foundation_NSObject_NSObject_Disposer_Drain_MonoTouch_Foundation_NSObject + 360
13  B2MobileBjiOS                   0x006802cc wrapper_runtime_invoke_object_runtime_invoke_dynamic_intptr_intptr_intptr_intptr + 196
14  B2MobileBjiOS                   0x01614b75 mono_jit_runtime_invoke (mini.c:6610)
15  B2MobileBjiOS                   0x0165c77b mono_runtime_invoke (object.c:2827)
16  B2MobileBjiOS                   0x0158e7e7 native_to_managed_trampoline_MonoTouch_Foundation_NSObject_NSObject_Disposer_Drain (registrar.m:344)
17  Foundation                      0x2fd22e47 __NSThreadPerformPerform + 383
18  CoreFoundation                  0x2f309f1d __CFRUNLOOP_IS_CALLING_OUT_TO_A_SOURCE0_PERFORM_FUNCTION__ + 13
19  CoreFoundation                  0x2f309469 __CFRunLoopDoSources0 + 337
20  CoreFoundation                  0x2f307bd3 __CFRunLoopRun + 627
21  CoreFoundation                  0x2f27246d CFRunLoopRunSpecific + 521
22  CoreFoundation                  0x2f27224f CFRunLoopRunInMode + 103
23  GraphicsServices                0x33f732e7 GSEventRunModal + 135
24  UIKit                           0x31b27841 UIApplicationMain + 1133
25  B2MobileBjiOS                   0x010e1af8 wrapper_managed_to_native_MonoTouch_UIKit_UIApplication_UIApplicationMain_int_string___intptr_intptr + 268
26  B2MobileBjiOS                   0x00d617f4 MonoTouch_UIKit_UIApplication_Main_string___string_string + 296
27  B2MobileBjiOS                   0x002062f4 B2.Mobile.Bj.iOS.Application:Main (Main.cs:16)
28  B2MobileBjiOS                   0x006802cc wrapper_runtime_invoke_object_runtime_invoke_dynamic_intptr_intptr_intptr_intptr + 196
29  B2MobileBjiOS                   0x01614b75 mono_jit_runtime_invoke (mini.c:6610)
30  B2MobileBjiOS                   0x0165c77b mono_runtime_invoke (object.c:2827)
31  B2MobileBjiOS                   0x016604f9 mono_runtime_exec_main (object.c:4052)
32  B2MobileBjiOS                   0x01660349 mono_runtime_run_main (object.c:3678)
33  B2MobileBjiOS                   0x015fec5d mono_jit_exec (driver.g.c:1009)
34  B2MobileBjiOS                   0x016a95cc main (main.m:489)
35  libdyld.dylib                   0x39ed3ab5 start + 1

AudioRecordingOverlay.cs

using System;
using MonoTouch.UIKit;
using System.Diagnostics;
using System.Drawing;
using System.Timers;
using B2.Mobile.Common.iOS;

namespace B2.Mobile.Common.Ui.iOS
{
    public class AudioRecordingOverlay : UIView
    {
        private const int OverlaySize = 120;
        private const int ImageSize = 60;

        private readonly Timer _timer;
        private readonly Stopwatch _stopwatch;
        private readonly UITapGestureRecognizer _tapGesture;
        private readonly AudioHelper _audioRecorder;

        private UIView _overlay;
        private UIButton _button;
        private UILabel _label;
        private string _audioUrl;

        public Action<string> ClosedAction { get; set; }

        public AudioRecordingOverlay(string filename, RectangleF frame) : base(frame)
        {
            _stopwatch = new Stopwatch();
            _timer = new Timer(1000);
            _timer.AutoReset = true;
            _timer.Elapsed += HandleElapsed;
            _tapGesture = new UITapGestureRecognizer(TapGestureAction);
            _audioRecorder = new AudioHelper(filename);

            InitializeUI();
        }

        public void InitializeUI()
        {
            BackgroundColor = UIColor.FromRGBA(255, 255, 255, 200);
            AddGestureRecognizer(_tapGesture);

            var xOverlay = (Bounds.Width - OverlaySize) / 2;
            var yOverlay = (Bounds.Height - OverlaySize) / 2;
            _overlay = new UIView(new RectangleF(xOverlay, yOverlay, OverlaySize, OverlaySize));
            _overlay.BackgroundColor = UIColor.FromRGBA(255, 255, 255, 127);
            _overlay.Layer.CornerRadius = 5.0f;
            _overlay.Layer.BorderWidth = 0.5f;
            _overlay.Layer.BorderColor = UIColor.FromRGBA(0, 0, 0, 127).CGColor;
            AddSubview(_overlay);

            // Start / Stop / Status
            var xButton = (OverlaySize - ImageSize) / 2;
            var yButton = 20;
            _button = new UIButton(UIButtonType.RoundedRect);
            _button.Frame = new RectangleF(xButton, yButton, 60, 60);
            _button.SetImage(UIImage.FromBundle("Images/microphone-large.png"), UIControlState.Normal);
            _button.TouchUpInside += RecordButtonClicked;
            _overlay.AddSubview(_button);

            // Timer
            _label = new UILabel();
            _label.Frame = new RectangleF(20, ImageSize + 30, _overlay.Bounds.Width - 40, _label.Font.LineHeight);
            _label.TextAlignment = UITextAlignment.Center;
            _label.Text = _stopwatch.Elapsed.ToString("mm\\:ss");
            _overlay.AddSubview(_label);
        }

        public void Start()
        {
            if (_stopwatch.IsRunning)
                return;

            _stopwatch.Reset();
            _stopwatch.Start();
            _timer.Start();
            _button.TintColor = UIColor.Red;

            _audioRecorder.StartRecording();
        }

        public void Stop()
        {
            if (!_stopwatch.IsRunning)
                return;

            _stopwatch.Stop();
            _timer.Stop();
            _button.TintColor = TintColor;

            _audioRecorder.StopRecording();
        }

        public void Save()
        {
            _audioUrl = _audioRecorder.Save();
        }

        public void Discard()
        {
            _audioRecorder.Discard();
        }

        public void Close()
        {
            if (ClosedAction != null)
                ClosedAction(_audioUrl);

            RemoveFromSuperview();
        }

        void RecordButtonClicked(object sender, EventArgs eventArgs)
        {
            if (_stopwatch.IsRunning)
            {
                Stop();

                using (var alert = new UIAlertView("Aufnahme", "Soll die Aufnahme gespeichert werden?", null, "Nein", new[] { "Ja" }))
                {
                    alert.Clicked += (s, e) =>
                    {
                        if (e.ButtonIndex == 0)
                        {
                            Discard();
                        }
                        else
                        {
                            Save();
                        }

                        Close();
                    };

                    alert.Show();
                }
            }
            else
            {
                Start();
            }
        }

        private void HandleElapsed (object sender, ElapsedEventArgs e)
        {
            InvokeOnMainThread(() =>
                {
                    _label.Text = _stopwatch.Elapsed.ToString("mm\\:ss");
                });
        }

        private void TapGestureAction(UITapGestureRecognizer recognizer)
        {
            if (_stopwatch.IsRunning)
            {
                using (var alert = new UIAlertView("Aufnahme", "Soll die Aufnahme gestoppt und gespeichert werden?", null, "Nein", new[] { "Ja" }))
                {
                    alert.Clicked += (sender, e) =>
                    {
                        if (e.ButtonIndex != 0)
                        {
                            Stop();
                            Save();
                            Close();
                        }
                    };

                    alert.Show();
                }
            }
            else
            {
                RemoveFromSuperview();
            }
        }
    }
}

AudioHelper.cs

using System;
using MonoTouch.AVFoundation;
using MonoTouch.Foundation;
using System.IO;

namespace B2.Mobile.Common.iOS
{
    public class AudioHelper
    {
        private AVAudioRecorder _recorder;
        private readonly string _filename;
        private NSUrl _audioUrl;
        private AVAudioSession _session;
        private AVAudioPlayer _player;

        public Action FinishedPlayingAction {get;set;}

        public AudioHelper(string filename)
        {
            _filename = filename;
        }

        public void Play()
        {
            NSError error = null;
            _player = new AVAudioPlayer(new NSUrl(_filename), error);
            _player.PrepareToPlay();
            _player.FinishedPlaying += (sender, e) => 
            {
                if (FinishedPlayingAction != null)
                    FinishedPlayingAction();
            };

            _session = AVAudioSession.SharedInstance();
            _session.SetCategory(AVAudioSession.CategoryPlayback, out error);
            _session.SetActive(true, out error);

            _player.Play();
        }

        public void Continue()
        {
            if (_player != null)
                _player.Play();
        }

        public void Stop()
        {
            if (_player != null)
                _player.Stop();
        }

        public void Pause()
        {
            if (_player != null)
                _player.Pause();
        }

        public void StartRecording()
        {
            _recorder = new AVAudioRecorder();
            _session = AVAudioSession.SharedInstance();

            NSError error;
            _session.SetCategory(AVAudioSession.CategoryRecord, out error);
            if(error != null)
            {
                Console.WriteLine(error);
                return;
            }

            _session.SetActive(true, out error);
            if(error != null)
            {
                Console.WriteLine(error);
                return;
            }

            if(!PrepareAudioRecording())
            {
                throw new InvalidOperationException("Aufnahme fehlgeschlagen");
            }

            if(!_recorder.Record())
            {
                throw new InvalidOperationException("Aufnahme fehlgeschlagen");
            }
        }

        public void StopRecording()
        {
            if (!_recorder.Recording)
                return;

            _recorder.Stop();
        }

        public string Save()
        {
            return _audioUrl.Path;
        }

        public void Discard()
        {
            try
            {
                File.Delete(_audioUrl.Path);
            }
            catch 
            {
            }
        }

        private bool PrepareAudioRecording()
        {
            //Declare string for application temp path and tack on the file extension
            var tempRecording = Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.Personal), _filename);

            Console.WriteLine(tempRecording);
            _audioUrl = NSUrl.FromFilename(tempRecording);

            var audioSettings = new AudioSettings
            {
                SampleRate = 44100.0f, 
                Format = MonoTouch.AudioToolbox.AudioFormatType.MPEG4AAC,
                NumberChannels = 1,
                AudioQuality = AVAudioQuality.High
            };

            //Set recorder parameters
            NSError error;
            _recorder = AVAudioRecorder.Create(_audioUrl, audioSettings, out error);
            if((_recorder == null) || (error != null))
            {
                Console.WriteLine(error);
                return false;
            }

            //Set Recorder to Prepare To Record
            if(!_recorder.PrepareToRecord())
            {
                _recorder.Dispose();
                _recorder = null;
                return false;
            }

            _recorder.FinishedRecording += (sender, e) => 
            {
                _recorder.Dispose();
                _recorder = null;
                Console.WriteLine("Done Recording (status: {0})", e.Status);
            };

            return true;
        }
    }
}

2 Answers:

Answer 0 (score: 1)

The problem is in AVAudioRecorder. If you create an AVAudioRecorder with the object's parameterless constructor, it will crash when the object is released.

You can easily reproduce the crash in C#:

var recorder = new AVAudioRecorder();
recorder.Dispose();

or in Swift (even in a playground):

var recorder: AVAudioRecorder? = AVAudioRecorder()
recorder = nil

In your code you initialize _recorder with new AVAudioRecorder(), so when garbage collection runs and the object is disposed, you get this exception.

Just create the recorder with the "correct" factory method instead:

AVAudioRecorder.Create(NSUrl url, AudioSettings settings, out NSError error);
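Applied to the StartRecording method from the question, the fix amounts to dropping the parameterless construction and relying on PrepareAudioRecording, which already creates the recorder via AVAudioRecorder.Create. A sketch of the corrected method (same structure as the question's code, only the offending line removed):

```csharp
public void StartRecording()
{
    // Do NOT do this: a parameterless AVAudioRecorder crashes in dealloc.
    // _recorder = new AVAudioRecorder();

    _session = AVAudioSession.SharedInstance();

    NSError error;
    _session.SetCategory(AVAudioSession.CategoryRecord, out error);
    if (error != null)
    {
        Console.WriteLine(error);
        return;
    }

    _session.SetActive(true, out error);
    if (error != null)
    {
        Console.WriteLine(error);
        return;
    }

    // PrepareAudioRecording assigns _recorder through
    // AVAudioRecorder.Create(url, settings, out error), which is safe.
    if (!PrepareAudioRecording())
        throw new InvalidOperationException("Aufnahme fehlgeschlagen");

    if (!_recorder.Record())
        throw new InvalidOperationException("Aufnahme fehlgeschlagen");
}
```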

Answer 1 (score: 0)

You may be closing the view while the recorder is still recording. Add a call to Stop in the Close function (or make stopping part of Close), and don't call RemoveFromSuperview until the recorder has actually stopped. Override the Dispose function to inspect the recorder's state at the time it is called.
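A minimal sketch of that suggestion, assuming Stop() is safe to call when nothing is recording (in the question's code it is guarded by _stopwatch.IsRunning, so a redundant call is a no-op):

```csharp
public void Close()
{
    // Defensively stop recording before tearing the view down,
    // so the AVAudioRecorder is never released mid-recording.
    Stop();

    if (ClosedAction != null)
        ClosedAction(_audioUrl);

    RemoveFromSuperview();
}
```

Note that this only guards the teardown ordering; it does not address the parameterless-constructor issue described in the other answer.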

This also looks like a redundant allocation:

_recorder = new AVAudioRecorder();

And do you really need all of those public functions? Save, Close, Discard, ...?