Xamarin OpenEars native binding does not work on a device, but works on the Simulator

Date: 2015-05-15 14:17:31

Tags: xcode xamarin xamarin.ios openears

I have been working with the OpenEars v2.03 iOS framework in a Xamarin iOS Binding project. Let me explain what I have done so far. I am new to Xcode, Xamarin, and this whole binding business, so this is going to be a long question; bear with me...

1) Built the OpenEars framework project in Xcode for the Simulator. Copied the "OpenEars" binary from Framework/OpenEars.framework/Versions/Current/ and renamed it to "libOpenEars-i386.a".

Likewise, I built the same library for an iPhone 4S device by connecting the device to the Mac and selecting my iPhone as the build target. Finally, I copied the generated OpenEars binary and renamed it to "libOpenEars-armv7.a". (The copy/rename step is sketched below.)
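Roughly, the copy/rename looks like this (a minimal sketch only; the build output paths are assumptions based on a default Release build and may differ on other machines):

# Simulator build (iphonesimulator SDK) -> i386 slice
cp build/Release-iphonesimulator/OpenEars.framework/Versions/Current/OpenEars libOpenEars-i386.a

# Device build (iphoneos SDK) -> armv7 slice
cp build/Release-iphoneos/OpenEars.framework/Versions/Current/OpenEars libOpenEars-armv7.a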

2) Used the lipo command below to bundle the two files (libOpenEars-i386.a, libOpenEars-armv7.a) into a single file, "libOpenEars.a".

lipo -create -output libOpenEars.a libOpenEars-i386.a libOpenEars-armv7.a 

3) Created a Binding project in Xamarin Studio and added libOpenEars.a, which automatically generates libOpenEars.linkwith.cs. Here is its code:

using System;
using ObjCRuntime;

[assembly: LinkWith ("libOpenEars.a", LinkTarget.ArmV7 | LinkTarget.Simulator, SmartLink = true, ForceLoad = true, Frameworks="AudioToolbox AVFoundation", IsCxx=true, LinkerFlags = "-lstdc++")]

I also tried changing the linker flags to LinkerFlags = "-lstdc++ -lc++ -ObjC" and setting SmartLink = false.

4) My ApiDefinition file contains all of the OpenEars interfaces; I am only adding one of them here (a sketch of the matching delegate protocol follows it).

[BaseType(typeof(NSObject))]
[Protocol]
interface OEEventsObserver
{
    [Wrap ("WeakDelegate")]
    OEEventsObserverDelegate Delegate { get; set; }

    [Export ("delegate", ArgumentSemantic.Assign), NullAllowed]
    NSObject WeakDelegate { get; set; }
}
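The matching delegate protocol lives in the same ApiDefinition file and looks roughly like this (a trimmed sketch, not my full definition; the selector names are assumptions based on the OpenEars 2.x OEEventsObserver header):

[BaseType (typeof (NSObject))]
[Model]
[Protocol]
interface OEEventsObserverDelegate
{
    // Called once the recognition loop has started.
    [Export ("pocketsphinxDidStartListening")]
    void PocketsphinxDidStartListening ();

    // Called with the recognized text, its score and the utterance ID.
    [Export ("pocketsphinxDidReceiveHypothesis:recognitionScore:utteranceID:")]
    void PocketsphinxDidReceiveHypothesis (NSString hypothesis, NSString recognitionScore, NSString utteranceID);
}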

5) Referenced OpenEars.dll in my sample iOS project.

6) Added the language model and acoustic model to the Binding library itself. (Although this is not needed with dynamic language model generation, I used the old OpenEars sample project from the OpenEars Xamarin git; I did not use the new DynamicLanguageModel generator, just modified the sample for the latest API changes.)

View controller:

public partial class OpenEarsNewApiViewController : UIViewController
{
    OEEventsObserver observer;
    OEFliteController fliteController;
    OEPocketsphinxController pocketSphinxController;


    String pathToLanguageModel;
    String pathToDictionary;
    String pathToAcousticModel;

    String firstVoiceToUse;
    String secondVoiceToUse;

    static bool UserInterfaceIdiomIsPhone {
        get { return UIDevice.CurrentDevice.UserInterfaceIdiom == UIUserInterfaceIdiom.Phone; }
    }

    public void init()
    {
        try
        {
            observer = new OEEventsObserver();
            observer.Delegate = new OpenEarsEventsObserverDelegate (this);
            pocketSphinxController = new OEPocketsphinxController ();

            fliteController = new OEFliteController();

            firstVoiceToUse = "cmu_us_slt";
            secondVoiceToUse = "cmu_us_rms";

            pathToLanguageModel = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.languagemodel";
            pathToDictionary = NSBundle.MainBundle.ResourcePath + System.IO.Path.DirectorySeparatorChar + "OpenEars1.dic";
            pathToAcousticModel = NSBundle.MainBundle.ResourcePath;
        }
        catch(Exception e) {
            Console.WriteLine ("Exception Message :"+e.Message);
            Console.WriteLine ("Inner Exception Mesage :"+e.InnerException.Message);
        }

    }

    public OpenEarsNewApiViewController (IntPtr handle) : base (handle)
    {
        init ();
    }

    #region Update

    public void UpdateStatus (String text)
    {
        txtStatus.Text = text;
    }

    public void UpdateText (String text)
    {
        txtOutput.Text = text;
    }

    public void UpdateButtonStates (bool hidden1, bool hidden2, bool hidden3, bool hidden4)
    {
        btnStartListening.Hidden = hidden1;
        btnStopListening.Hidden = hidden2;
        btnSuspend.Hidden = hidden3;
        btnResume.Hidden = hidden4;
    }

    public void Say (String text)
    {
        //fliteController.SaywithVoice (text, secondVoiceToUse);
    }

    public void StartListening ()
    {
        //pocketSphinxController.RequestMicPermission ();
        if (!pocketSphinxController.IsListening) {

            //NSString *correctPathToMyLanguageModelFile = [NSString stringWithFormat:@"%@/TheNameIChoseForMyLanguageModelAndDictionaryFile.%@",[NSSearchPathForDirectoriesInDomains(NSCachesDirectory, NSUserDomainMask, YES) objectAtIndex:0],@"DMP"];


            pocketSphinxController.StartListeningWithLanguageModelAtPath (
                pathToLanguageModel,
                pathToDictionary,
                pathToAcousticModel,
                false
            );
        } else {
            new UIAlertView ("Notify !!","Already Listening",null,"OK","Stop").Show();

        }

    }

    public void StopListening ()
    {
        //pocketSphinxController.StopListening ();
    }

    public void SuspendRecognition ()
    {
        pocketSphinxController.SuspendRecognition ();
    }

    public void ResumeRecognition ()
    {
        pocketSphinxController.ResumeRecognition ();
    }

    #endregion

    #region Event Handlers

    partial void btnStartListening_TouchUpInside (UIButton sender)
    {
        try
        {
            StartListening();
            //fliteController.Init();
            //Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
            //fliteController.Say("Hai", new OEFliteVoice());

            UpdateButtonStates (true, false, false, true);
            Console.WriteLine("Speech in Progress :"+fliteController.SpeechInProgress);
        }
        catch(Exception e)
        {
            Console.WriteLine(e.Message);
        }
    }

    partial void btnStopListening_TouchUpInside (UIButton sender)
    {
        StopListening ();
        UpdateButtonStates (false, true, true, true);
    }

    partial void btnSuspend_TouchUpInside (UIButton sender)
    {
        SuspendRecognition ();
        UpdateButtonStates (true, false, true, false);
    }

    partial void btnResume_TouchUpInside (UIButton sender)
    {
        ResumeRecognition ();
        UpdateButtonStates (true, false, false, true);
    }
}

OpenEarsEventsObserverDelegate:

// Nothing much here, just status updates and debugging output.

public class OpenEarsEventsObserverDelegate:OEEventsObserverDelegate
{
    OpenEarsNewApiViewController _controller;

    public OpenEarsNewApiViewController controller {
        get {
            return _controller;
        }
        set {
            _controller = value;
        }
    }

    public OpenEarsEventsObserverDelegate (OpenEarsNewApiViewController ctrl)
    {
        controller = ctrl;
    }

    public override void PocketsphinxRecognitionLoopDidStart()
    {
        //base.PocketsphinxRecognitionLoopDidStart();

        Console.WriteLine ("Pocketsphinx is starting up");
        controller.UpdateStatus ("Pocketsphinx is starting up");
    }

    public override void PocketsphinxDidReceiveHypothesis (Foundation.NSString hypothesis, Foundation.NSString recognitionScore, Foundation.NSString utteranceID)
    {
        controller.UpdateText ("Heard: " + hypothesis);
        controller.Say ("You said: " + hypothesis);
    }

    public override void PocketSphinxContinuousSetupDidFail ()
    {

    }

    public override void PocketsphinxDidCompleteCalibration ()
    {
        Console.WriteLine ("Pocket calibration is complete");
        controller.UpdateStatus ("Pocket calibratio is complete");
    }

    public override void PocketsphinxDidDetectSpeech ()
    {

    }

    public override void PocketsphinxDidStartListening ()
    {
        Console.WriteLine ("Pocketsphinx is now listening");
        controller.UpdateStatus ("Pocketphinx is now listening");
        controller.UpdateButtonStates (true, false, false, true);
    }

    public override void PocketsphinxDidStopListening ()
    {

    }

    public override void PocketsphinxDidStartCalibration ()
    {
        Console.WriteLine ("Pocketsphinx calibration has started.");
        controller.UpdateStatus ("Pocketsphinx calibration has started");
    }

    public override void PocketsphinxDidResumeRecognition ()
    {

    }

    public override void PocketsphinxDidSuspendRecognition ()
    {

    }

    public override void PocketsphinxDidDetectFinishedSpeech ()
    {

    }

    public override void FliteDidStartSpeaking ()
    {

    }

    public override void FliteDidFinishSpeaking ()
    {

    }
}

This works fine on the iOS Simulator, but it does not run on a real device.

Simulator screen shot.

I get this error message when running on the device. I am getting the same message for all of the interfaces.

Exception Message :Wrapper type 'OpenEars.OEEventsObserver' is missing its native ObjectiveC class 'OEEventsObserver'.

2015-05-15 12:55:26.996 OpenEarsNewApi[1359:231264] Unhandled managed  exception: Exception has been thrown by the target of an invocation.  (System.Reflection.TargetInvocationException)
at System.Reflection.MonoCMethod.InternalInvoke (System.Object obj,   System.Object[] parameters) [0x00016] in   /Developer/MonoTouch/Source/mono/mcs/class/corlib/System.Reflection/MonoMethod.cs:543 

Am I missing anything related to binding for the device?

I also tried building the same .dll with a Makefile, but got the same error message.

To build the OpenEars framework:

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphonesimulator8.2 -arch i386 -configuration Release clean build

xcodebuild -project OpenEars.xcodeproj -target OpenEars -sdk iphoneos -arch armv7 -configuration Release clean build

Makefile for generating OpenEars.dll:

BTOUCH=/Developer/MonoTouch/usr/bin/btouch-native

all: OpenEars.dll


OpenEars.dll: AssemblyInfo.cs OpenEars.cs libOpenEars.a
	$(BTOUCH) -unsafe --new-style -out:$@ OpenEars.cs -x=AssemblyInfo.cs --link-with=libOpenEars.a,libOpenEars.a

clean:
	-rm -f *.dll
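With btouch-native installed at the path above, the binding assembly is built and cleaned with the usual targets:

make          # builds OpenEars.dll via btouch-native
make clean    # removes the generated assembly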

Check the complete mtouch error log here.

$ lipo -info libOpenEars.a

Architectures in the fat file: libOpenEars.a are: i386 armv7 

Checked with $ nm -arch armv7 libOpenEars.a

nm command output here

Checked that the OEEvent symbols are present in the Simulator (i386) slice:

$ nm -arch i386 libOpenEars.a | grep OEEvent

Output:

U _OBJC_CLASS_$_OEEventsObserver
00006aa0 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000076f0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning: /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00002174 S _OBJC_CLASS_$_OEEventsObserver
00002170 S _OBJC_IVAR_$_OEEventsObserver._delegate
00002188 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002d90 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000035a0 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

Checked that the OEEvent symbols are present in the armv7 slice:

$ nm -arch armv7 libOpenEars.a | grep OEEvent

Output:

 U _OBJC_CLASS_$_OEEventsObserver
00005680 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
000062d8 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate
warning:    /Applications/Xcode.app/Contents/Developer/Toolchains/XcodeDefault.xctoolchain/usr/bin/nm: no name list
libOpenEars.a(OEEventsObserver.o):
00001cb4 S _OBJC_CLASS_$_OEEventsObserver
00001cb0 S _OBJC_IVAR_$_OEEventsObserver._delegate
00001cc8 S _OBJC_METACLASS_$_OEEventsObserver
     U _OBJC_CLASS_$_OEEventsObserver
00002638 S l_OBJC_LABEL_PROTOCOL_$_OEEventsObserverDelegate
00002e50 S l_OBJC_PROTOCOL_$_OEEventsObserverDelegate

I am not sure what I am missing. And yes, there are plenty of grammatical mistakes; thank you for taking the time to read this post.

1 Answer:

Answer 0 (score: 0):

Thanks to @poupou and @Halle for the valuable comments. In the end I built the fat binary with all of the architectures, including arm64 and x86_64 (a must), and combined them into a single bundle with lipo. Now it works like a charm!... I also set Project Options -> Advanced -> Supported Architectures -> ARMv7 so it runs on devices such as the iPad 2 and iPhone 4. It still needs to be tested on the iPhone 6 and 6 Plus, which I expect to work as well since they are in the arm64 family. I am not sure how it will behave on ARMv7s devices (iPhone 5, iPhone 5c, iPad 4); I did not see ARMv7s support in OpenEars v2.03.
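For anyone hitting the same wall, the working setup boils down to something like the following (a sketch only; it assumes one static library slice has been built and named per architecture, which is not shown in the question):

# Combine all slices (32-bit and 64-bit, device and simulator) into one fat library
lipo -create -output libOpenEars.a \
    libOpenEars-i386.a libOpenEars-x86_64.a \
    libOpenEars-armv7.a libOpenEars-arm64.a

# Confirm every architecture made it in
lipo -info libOpenEars.a

The LinkWith attribute in the binding project would then declare the extra targets as well (again, a sketch of what that change would look like):

[assembly: LinkWith ("libOpenEars.a",
    LinkTarget.ArmV7 | LinkTarget.Arm64 | LinkTarget.Simulator | LinkTarget.Simulator64,
    SmartLink = true, ForceLoad = true,
    Frameworks = "AudioToolbox AVFoundation", IsCxx = true, LinkerFlags = "-lstdc++")]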