PocketSphinx Android demo runtime exception

Posted: 2014-09-09 06:33:02

Tags: java android offline voice-recognition pocketsphinx-android

I downloaded the source code of the PocketSphinx demo. I am trying to run it, but it throws a runtime exception. The logcat output of the run is posted below.

09-09 11:45:38.980: I/System.out(7912): Sending WAIT chunk
09-09 11:45:38.980: W/ActivityThread(7912): Application edu.cmu.pocketsphinx.demo is waiting for the debugger on port 8100...

09-09 11:45:39.030: I/dalvikvm(7912): Debugger is active
09-09 11:45:39.210: I/System.out(7912): Debugger has connected
09-09 11:45:39.210: I/System.out(7912): waiting for debugger to settle...
09-09 11:45:39.400: I/System.out(7912): waiting for debugger to settle...
09-09 11:45:39.600: I/System.out(7912): waiting for debugger to settle...
09-09 11:45:39.810: I/System.out(7912): waiting for debugger to settle...
09-09 11:45:40.000: I/System.out(7912): waiting for debugger to settle...
09-09 11:45:40.210: I/System.out(7912): waiting for debugger to settle...
09-09 11:45:40.400: I/System.out(7912): waiting for debugger to settle...
09-09 11:45:40.600: I/System.out(7912): waiting for debugger to settle...
09-09 11:45:40.810: I/System.out(7912): waiting for debugger to settle...
09-09 11:45:41.010: I/System.out(7912): debugger has settled (1359)
09-09 11:45:41.930: D/dalvikvm(7912): threadid=1: still suspended after undo (sc=1 dc=1)
09-09 11:45:48.960: I/dalvikvm(7912): threadid=4: reacting to signal 3
09-09 11:45:48.960: D/dalvikvm(7912): threadid=1: still suspended after undo (sc=1 dc=1)
09-09 11:45:48.960: I/dalvikvm(7912): Wrote stack traces to '/data/anr/traces.txt'
09-09 11:45:52.310: D/dalvikvm(7912): GC_EXTERNAL_ALLOC freed 44K, 49% free 2778K/5379K, external 0K/0K, paused 26ms
09-09 11:45:58.770: D/CLIPBOARD(7912): Hide Clipboard dialog at Starting input: finished by someone else... !
09-09 11:46:05.860: D/dalvikvm(7912): GC_CONCURRENT freed 366K, 50% free 2845K/5639K, external 7K/1286K, paused 2ms+3ms
09-09 11:46:05.870: I/Assets(7912): Skipping asset models/grammar/menu.gram: checksums are equal
09-09 11:46:05.870: I/Assets(7912): Skipping asset models/grammar/digits.gram: checksums are equal
09-09 11:46:05.880: I/Assets(7912): Skipping asset models/lm/3015.lm: checksums are equal
09-09 11:46:05.890: I/Assets(7912): Skipping asset models/hmm/en-us/noisedict: checksums are equal
09-09 11:46:05.900: I/Assets(7912): Skipping asset models/hmm/en-us/mixture_weights: checksums are equal
09-09 11:46:05.900: I/Assets(7912): Skipping asset models/hmm/en-us/means: checksums are equal
09-09 11:46:05.900: I/Assets(7912): Skipping asset models/hmm/en-us/variances: checksums are equal
09-09 11:46:05.900: I/Assets(7912): Skipping asset models/hmm/en-us/transition_matrices: checksums are equal
09-09 11:46:05.910: I/Assets(7912): Skipping asset models/dict/5497.dic: checksums are equal
09-09 11:46:05.910: I/Assets(7912): Skipping asset models/hmm/en-us/feature_transform: checksums are equal
09-09 11:46:05.910: I/Assets(7912): Skipping asset models/hmm/en-us/mdef: checksums are equal
09-09 11:46:05.910: I/Assets(7912): Skipping asset models/hmm/en-us/feat.params: checksums are equal
09-09 11:46:05.910: I/Assets(7912): Skipping asset models/hmm/en-us/README: checksums are equal
09-09 11:46:15.670: D/dalvikvm(7912): Trying to load lib /data/data/edu.cmu.pocketsphinx.demo/lib/libpocketsphinx_jni.so 0x4051cbd0
09-09 11:46:15.680: D/dalvikvm(7912): Added shared lib /data/data/edu.cmu.pocketsphinx.demo/lib/libpocketsphinx_jni.so 0x4051cbd0
09-09 11:46:15.680: D/dalvikvm(7912): No JNI_OnLoad found in /data/data/edu.cmu.pocketsphinx.demo/lib/libpocketsphinx_jni.so 0x4051cbd0, skipping init
09-09 11:46:15.680: I/cmusphinx(7912): INFO: cmd_ln.c(696): Parsing command line:
09-09 11:46:15.680: I/cmusphinx(7912): Current configuration:
09-09 11:46:31.640: I/cmusphinx(7912): INFO: cmd_ln.c(696): Parsing command line:
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -nfilt 
09-09 11:46:31.640: I/cmusphinx(7912): 25 
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -lowerf 
09-09 11:46:31.640: I/cmusphinx(7912): 130 
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -upperf 
09-09 11:46:31.640: I/cmusphinx(7912): 6800 
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -transform 
09-09 11:46:31.640: I/cmusphinx(7912): dct 
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -lifter 
09-09 11:46:31.640: I/cmusphinx(7912): 22 
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -feat 
09-09 11:46:31.640: I/cmusphinx(7912): 1s_c_d_dd 
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -agc 
09-09 11:46:31.640: I/cmusphinx(7912): none 
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -cmn 
09-09 11:46:31.640: I/cmusphinx(7912): current 
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -varnorm 
09-09 11:46:31.640: I/cmusphinx(7912): no 
09-09 11:46:31.640: I/cmusphinx(7912): \
09-09 11:46:31.640: I/cmusphinx(7912):  
09-09 11:46:31.640: I/cmusphinx(7912): -cmninit 
09-09 11:46:31.640: I/cmusphinx(7912): 40 
09-09 11:46:31.640: I/cmusphinx(7912): Current configuration:
09-09 11:46:31.640: I/cmusphinx(7912): INFO: acmod.c(251): Parsed model-specific feature parameters from /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/feat.params
09-09 11:46:31.640: I/cmusphinx(7912): INFO: feat.c(715): Initializing feature stream to type: '1s_c_d_dd', ceplen=13, CMN='current', VARNORM='no', AGC='none'
09-09 11:46:31.640: I/cmusphinx(7912): INFO: cmn.c(143): mean[0]= 12.00, mean[1..12]= 0.0
09-09 11:46:31.640: I/cmusphinx(7912): INFO: acmod.c(160): Reading linear feature transformation from /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/feature_transform
09-09 11:46:31.650: I/cmusphinx(7912): INFO: mdef.c(517): Reading model definition: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/mdef
09-09 11:46:33.430: I/cmusphinx(7912): INFO: bin_mdef.c(181): Allocating 173954 * 8 bytes (1359 KiB) for CD tree
09-09 11:46:33.570: I/cmusphinx(7912): INFO: tmat.c(206): Reading HMM transition probability matrices: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/transition_matrices
09-09 11:46:33.570: I/cmusphinx(7912): INFO: acmod.c(123): Attempting to use SCHMM computation module
09-09 11:46:33.570: I/cmusphinx(7912): INFO: ms_gauden.c(198): Reading mixture gaussian parameter: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/means
09-09 11:46:35.110: I/cmusphinx(7912): INFO: ms_gauden.c(292): 6138 codebook, 1 feature, size: 
09-09 11:46:35.110: I/cmusphinx(7912): INFO: ms_gauden.c(294):  32x32
09-09 11:46:35.110: I/cmusphinx(7912): INFO: ms_gauden.c(198): Reading mixture gaussian parameter: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/variances
09-09 11:46:36.890: I/cmusphinx(7912): INFO: ms_gauden.c(292): 6138 codebook, 1 feature, size: 
09-09 11:46:36.900: I/cmusphinx(7912): INFO: ms_gauden.c(294):  32x32
09-09 11:46:38.920: I/cmusphinx(7912): INFO: ms_gauden.c(354): 768 variance values floored
09-09 11:46:38.950: I/cmusphinx(7912): INFO: acmod.c(125): Attempting to use PTHMM computation module
09-09 11:46:38.960: I/cmusphinx(7912): INFO: ms_gauden.c(198): Reading mixture gaussian parameter: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/means
09-09 11:46:40.140: I/cmusphinx(7912): INFO: ms_gauden.c(292): 6138 codebook, 1 feature, size: 
09-09 11:46:40.140: I/cmusphinx(7912): INFO: ms_gauden.c(294):  32x32
09-09 11:46:40.140: I/cmusphinx(7912): INFO: ms_gauden.c(198): Reading mixture gaussian parameter: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/variances
09-09 11:46:40.820: I/cmusphinx(7912): INFO: ms_gauden.c(292): 6138 codebook, 1 feature, size: 
09-09 11:46:40.820: I/cmusphinx(7912): INFO: ms_gauden.c(294):  32x32
09-09 11:46:42.850: I/cmusphinx(7912): INFO: ms_gauden.c(354): 768 variance values floored
09-09 11:46:42.850: I/cmusphinx(7912): INFO: ptm_mgau.c(792): Number of codebooks exceeds 256: 6138
09-09 11:46:42.860: I/cmusphinx(7912): INFO: acmod.c(127): Falling back to general multi-stream GMM computation
09-09 11:46:42.860: I/cmusphinx(7912): INFO: ms_gauden.c(198): Reading mixture gaussian parameter: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/means
09-09 11:46:43.280: I/cmusphinx(7912): INFO: ms_gauden.c(292): 6138 codebook, 1 feature, size: 
09-09 11:46:43.280: I/cmusphinx(7912): INFO: ms_gauden.c(294):  32x32
09-09 11:46:43.280: I/cmusphinx(7912): INFO: ms_gauden.c(198): Reading mixture gaussian parameter: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/variances
09-09 11:46:43.480: I/cmusphinx(7912): INFO: ms_gauden.c(292): 6138 codebook, 1 feature, size: 
09-09 11:46:43.480: I/cmusphinx(7912): INFO: ms_gauden.c(294):  32x32
09-09 11:46:45.540: I/cmusphinx(7912): INFO: ms_gauden.c(354): 768 variance values floored
09-09 11:46:45.550: I/cmusphinx(7912): INFO: ms_senone.c(149): Reading senone mixture weights: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/mixture_weights
09-09 11:46:45.550: I/cmusphinx(7912): INFO: ms_senone.c(200): Truncating senone logs3(pdf) values by 10 bits
09-09 11:46:45.550: I/cmusphinx(7912): INFO: ms_senone.c(207): Not transposing mixture weights in memory
09-09 11:46:45.610: I/cmusphinx(7912): INFO: ms_senone.c(268): Read mixture weights for 6138 senones: 1 features x 32 codewords
09-09 11:46:45.610: I/cmusphinx(7912): INFO: ms_senone.c(320): Mapping senones to individual codebooks
09-09 11:46:45.610: I/cmusphinx(7912): INFO: ms_mgau.c(141): The value of topn: 4
09-09 11:46:45.620: I/cmusphinx(7912): INFO: dict.c(320): Allocating 4181 * 20 bytes (81 KiB) for word entries
09-09 11:46:45.620: I/cmusphinx(7912): INFO: dict.c(333): Reading main dictionary: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/dict/5497.dic
09-09 11:46:45.620: I/cmusphinx(7912): INFO: dict.c(213): Allocated 0 KiB for strings, 0 KiB for phones
09-09 11:46:45.620: I/cmusphinx(7912): INFO: dict.c(336): 76 words read
09-09 11:46:45.620: I/cmusphinx(7912): INFO: dict.c(342): Reading filler dictionary: /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/hmm/en-us/noisedict
09-09 11:46:45.620: I/cmusphinx(7912): INFO: dict.c(213): Allocated 0 KiB for strings, 0 KiB for phones
09-09 11:46:45.620: I/cmusphinx(7912): INFO: dict.c(345): 9 words read
09-09 11:46:45.620: I/cmusphinx(7912): INFO: dict2pid.c(396): Building PID tables for dictionary
09-09 11:46:45.620: I/cmusphinx(7912): INFO: dict2pid.c(406): Allocating 46^3 * 2 bytes (190 KiB) for word-initial triphones
09-09 11:46:45.650: I/cmusphinx(7912): INFO: dict2pid.c(132): Allocated 25576 bytes (24 KiB) for word-final triphones
09-09 11:46:45.650: I/cmusphinx(7912): INFO: dict2pid.c(196): Allocated 25576 bytes (24 KiB) for single-phone word triphones
09-09 11:46:59.750: I/cmusphinx(7912): INFO: kws_search.c(417): KWS(beam: -1080, plp: -23, default threshold -450)
09-09 11:46:59.750: E/cmusphinx(7912): ERROR: "kws_search.c", line 158: The word 'taking' is missing in the dictionary
09-09 11:47:07.390: I/SpeechRecognizer(7912): Load JSGF /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/grammar/menu.gram
09-09 11:47:07.400: I/cmusphinx(7912): INFO: jsgf.c(664): Defined rule: PUBLIC <menu.item>
09-09 11:47:07.400: I/cmusphinx(7912): INFO: fsg_model.c(215): Computing transitive closure for null transitions
09-09 11:47:07.400: I/cmusphinx(7912): INFO: fsg_model.c(277): 0 null transitions added
09-09 11:47:07.400: I/cmusphinx(7912): INFO: fsg_search.c(227): FSG(beam: -1080, pbeam: -1080, wbeam: -634; wip: -26, pip: 0)
09-09 11:47:07.400: E/cmusphinx(7912): ERROR: "fsg_search.c", line 142: The word 'forecast' is missing in the dictionary
09-09 11:47:26.240: I/SpeechRecognizer(7912): Load JSGF /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/grammar/digits.gram
09-09 11:47:26.240: I/cmusphinx(7912): INFO: jsgf.c(664): Defined rule: <digits.digit>
09-09 11:47:26.250: I/cmusphinx(7912): INFO: jsgf.c(664): Defined rule: <digits.g00001>
09-09 11:47:26.250: I/cmusphinx(7912): INFO: jsgf.c(664): Defined rule: PUBLIC <digits.digits>
09-09 11:47:26.250: I/cmusphinx(7912): INFO: jsgf.c(381): Right recursion <digits.g00001> 2 => 0
09-09 11:47:26.250: I/cmusphinx(7912): INFO: fsg_model.c(215): Computing transitive closure for null transitions
09-09 11:47:26.250: I/cmusphinx(7912): INFO: fsg_model.c(277): 0 null transitions added
09-09 11:47:26.250: I/cmusphinx(7912): INFO: fsg_search.c(227): FSG(beam: -1080, pbeam: -1080, wbeam: -634; wip: -26, pip: 0)
09-09 11:47:26.250: E/cmusphinx(7912): ERROR: "fsg_search.c", line 142: The word 'nine' is missing in the dictionary
09-09 11:47:36.880: I/SpeechRecognizer(7912): Load N-gram model /mnt/sdcard/Android/data/edu.cmu.pocketsphinx.demo/files/sync/models/lm/3015.lm
09-09 11:47:36.890: I/cmusphinx(7912): INFO: ngram_model_arpa.c(477): ngrams 1=58, 2=117, 3=140
09-09 11:47:36.890: I/cmusphinx(7912): INFO: ngram_model_arpa.c(135): Reading unigrams
09-09 11:47:36.890: I/cmusphinx(7912): INFO: ngram_model_arpa.c(516):       58 = #unigrams created
09-09 11:47:36.890: I/cmusphinx(7912): INFO: ngram_model_arpa.c(195): Reading bigrams
09-09 11:47:36.890: I/cmusphinx(7912): INFO: ngram_model_arpa.c(533):      117 = #bigrams created
09-09 11:47:36.890: I/cmusphinx(7912): INFO: ngram_model_arpa.c(534):       26 = #prob2 entries
09-09 11:47:36.890: I/cmusphinx(7912): INFO: ngram_model_arpa.c(542):       19 = #bo_wt2 entries
09-09 11:47:36.890: I/cmusphinx(7912): INFO: ngram_model_arpa.c(292): Reading trigrams
09-09 11:47:36.900: I/cmusphinx(7912): INFO: ngram_model_arpa.c(555):      140 = #trigrams created
09-09 11:47:36.900: I/cmusphinx(7912): INFO: ngram_model_arpa.c(556):       14 = #prob3 entries
09-09 11:47:36.900: I/cmusphinx(7912): INFO: ngram_search_fwdtree.c(99): 55 unique initial diphones
09-09 11:47:36.900: I/cmusphinx(7912): INFO: ngram_search_fwdtree.c(148): 0 root, 0 non-root channels, 11 single-phone words
09-09 11:47:36.900: I/cmusphinx(7912): INFO: ngram_search_fwdtree.c(186): Creating search tree
09-09 11:47:36.900: I/cmusphinx(7912): INFO: ngram_search_fwdtree.c(192): before: 0 root, 0 non-root channels, 11 single-phone words
09-09 11:47:36.900: I/cmusphinx(7912): INFO: ngram_search_fwdtree.c(326): after: max nonroot chan increased to 280
09-09 11:47:36.900: I/cmusphinx(7912): INFO: ngram_search_fwdtree.c(339): after: 55 root, 152 non-root channels, 10 single-phone words
09-09 11:47:36.900: I/cmusphinx(7912): INFO: ngram_search_fwdflat.c(157): fwdflat: min_ef_width = 4, max_sf_win = 25
09-09 11:47:48.380: I/SpeechRecognizer(7912): Start recognition "leave"
09-09 11:49:41.350: W/dalvikvm(7912): threadid=1: thread exiting with uncaught exception (group=0x4001e578)
09-09 11:49:41.450: E/AndroidRuntime(7912): FATAL EXCEPTION: main
09-09 11:49:41.450: E/AndroidRuntime(7912): java.lang.RuntimeException: Decoder_setSearch returned -1
09-09 11:49:41.450: E/AndroidRuntime(7912):     at edu.cmu.pocketsphinx.PocketSphinxJNI.Decoder_setSearch(Native Method)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at edu.cmu.pocketsphinx.Decoder.setSearch(Unknown Source)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at edu.cmu.pocketsphinx.SpeechRecognizer.startListening(Unknown Source)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at edu.cmu.pocketsphinx.demo.PocketSphinxActivity.switchSearch(PocketSphinxActivity.java:143)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at edu.cmu.pocketsphinx.demo.PocketSphinxActivity.access$1(PocketSphinxActivity.java:141)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at edu.cmu.pocketsphinx.demo.PocketSphinxActivity$1.onPostExecute(PocketSphinxActivity.java:102)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at edu.cmu.pocketsphinx.demo.PocketSphinxActivity$1.onPostExecute(PocketSphinxActivity.java:1)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at android.os.AsyncTask.finish(AsyncTask.java:417)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at android.os.AsyncTask.access$300(AsyncTask.java:127)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at android.os.AsyncTask$InternalHandler.handleMessage(AsyncTask.java:429)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at android.os.Handler.dispatchMessage(Handler.java:99)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at android.os.Looper.loop(Looper.java:130)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at android.app.ActivityThread.main(ActivityThread.java:3770)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at java.lang.reflect.Method.invokeNative(Native Method)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at java.lang.reflect.Method.invoke(Method.java:507)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:912)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:670)
09-09 11:49:41.450: E/AndroidRuntime(7912):     at dalvik.system.NativeStart.main(Native Method)
09-09 11:49:43.580: I/dalvikvm(7912): threadid=4: reacting to signal 3
09-09 11:49:43.610: I/dalvikvm(7912): Wrote stack traces to '/data/anr/traces.txt'

1 Answer:

Answer 0 (score: 0)

You can see the following errors in the log:

09-09 11:47:26.250: E/cmusphinx(7912): ERROR: "fsg_search.c", line 142: The word 'nine' is missing in the dictionary
09-09 11:46:59.750: E/cmusphinx(7912): ERROR: "kws_search.c", line 158: The word 'taking' is missing in the dictionary

Your dictionary looks incompatible with your changes. You need to make sure the dictionary contains all the required words.
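For reference (these entries are not taken from your files, just standard CMUdict pronunciations), a PocketSphinx dictionary is a plain-text file with one word per line followed by its phone sequence, so the words reported as missing would need lines such as:

forecast F AO R K AE S T
nine N AY N
taking T EY K IH NG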

Also, I see that you made some changes, but most of them are incomplete. For example, you commented out the code that sets up the named searches in setupRecognizer, but did not change the switching code, so it still tries to switch to the old searches. You can try something like this, which should work:

package edu.cmu.pocketsphinx.demo;

import static android.widget.Toast.makeText;
import static edu.cmu.pocketsphinx.SpeechRecognizerSetup.defaultSetup;

import java.io.File;
import java.io.IOException;
import java.util.HashMap;

import android.app.Activity;
import android.os.AsyncTask;
import android.os.Bundle;
import android.util.Log;
import android.widget.TextView;
import android.widget.Toast;
import edu.cmu.pocketsphinx.Assets;
import edu.cmu.pocketsphinx.Hypothesis;
import edu.cmu.pocketsphinx.RecognitionListener;
import edu.cmu.pocketsphinx.SpeechRecognizer;

public class PocketSphinxActivity extends Activity implements
        RecognitionListener {

    private static final String KWS_SEARCH = "KEYPHRASE";
    private static final String DICTATION_SEARCH = "DICTATION";

    private static final String KEYPHRASE = "MORNING";

    private SpeechRecognizer recognizer;
    private HashMap<String, Integer> captions;

    @Override
    public void onCreate(Bundle state) {
        super.onCreate(state);

        // Prepare the data for UI
        captions = new HashMap<String, Integer>();
        captions.put(KWS_SEARCH, R.string.kws_caption);
        captions.put(DICTATION_SEARCH, R.string.forecast_caption);
        setContentView(R.layout.main);
        ((TextView) findViewById(R.id.caption_text))
                .setText("Preparing the recognizer");

        // Recognizer initialization is a time-consuming operation and involves IO,
        // so we execute it in an async task.

        new AsyncTask<Void, Void, Exception>() {
            @Override
            protected Exception doInBackground(Void... params) {
                try {
                    Assets assets = new Assets(PocketSphinxActivity.this);

                    File assetDir = assets.syncAssets();

                    setupRecognizer(assetDir);

                    recognizer.startListening(KWS_SEARCH);

                } catch (IOException e) {
                    return e;
                }
                return null;
            }

            @Override
            protected void onPostExecute(Exception result) {
                if (result != null) {
                    ((TextView) findViewById(R.id.caption_text))
                            .setText("Failed to init recognizer " + result);
                } else {
                    switchSearch(KWS_SEARCH);
                }
            }
        }.execute();
    }

    @Override
    public void onPartialResult(Hypothesis hypothesis) {
        // Partial results can be null, so guard before reading the hypothesis.
        if (hypothesis == null)
            return;

        String text = hypothesis.getHypstr();
        Log.d("Spoken text", text);

        if (text.equals(KEYPHRASE))
            switchSearch(DICTATION_SEARCH);
        else
            ((TextView) findViewById(R.id.result_text)).setText(text);
    }

    @Override
    public void onResult(Hypothesis hypothesis) {
        ((TextView) findViewById(R.id.result_text)).setText("");
        if (hypothesis != null) {
            String text = hypothesis.getHypstr();
            makeText(getApplicationContext(), text, Toast.LENGTH_SHORT).show();
        }
    }

    @Override
    public void onBeginningOfSpeech() {
    }

    @Override
    public void onEndOfSpeech() {
        Log.d("end","In end of speech");
        if (DICTATION_SEARCH.equals(recognizer.getSearchName()))
            switchSearch(KWS_SEARCH);
    }

    private void switchSearch(String searchName) {
        recognizer.stop();
        recognizer.startListening(searchName);
        String caption = getResources().getString(captions.get(searchName));
        ((TextView) findViewById(R.id.caption_text)).setText(caption);
    }

    private void setupRecognizer(File assetsDir) {
        File modelsDir = new File(assetsDir, "models");
        recognizer = defaultSetup()
                .setAcousticModel(new File(modelsDir, "hmm/en-us"))
                .setDictionary(new File(modelsDir, "dict/5497.dic"))
                .setRawLogDir(assetsDir).setKeywordThreshold(1e-20f)
                .setFloat("-beam", 1e-30f)
                .getRecognizer();
        recognizer.addListener(this);

        // Create keyword-activation search.
        recognizer.addKeyphraseSearch(KWS_SEARCH, KEYPHRASE);

        // Create language model search.
        File languageModel = new File(modelsDir, "lm/3015.lm");
        recognizer.addNgramSearch(DICTATION_SEARCH, languageModel);
    }
}
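Note how both search names used by switchSearch (KWS_SEARCH and DICTATION_SEARCH) are registered in setupRecognizer through addKeyphraseSearch and addNgramSearch before any call to startListening. If a search was never added, or its creation failed (for example because of the missing dictionary words above), setSearch cannot find it, which is what the "Decoder_setSearch returned -1" exception in your log indicates.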