How to use a native C library in Android Studio

Date: 2015-02-27 03:13:01

Tags: java android android-studio android-ndk ffmpeg

A few years ago I created a project based on https://ikaruga2.wordpress.com/2011/06/15/video-live-wallpaper-part-1/. The project was built in the version of Eclipse that Google supplied directly at the time, and it worked fine with a copy of the compiled ffmpeg libraries created under my app's name.

Now I'm trying to create a new app based on my old one. Since Google no longer supports Eclipse, I downloaded Android Studio and imported my project. With a few tweaks I was able to compile the old version of the project successfully. So I changed the name, copied a fresh set of ".so" files into app\src\main\jniLibs\armeabi (where I believe they are supposed to go), and tried running the app on my phone again, with absolutely no other changes.

The NDK doesn't throw any errors. Gradle compiles everything without errors and installs it on my phone. The app appears in my list of live wallpapers, and I can click it to bring up the preview. But instead of video I get an error, and logcat reports:

02-26 21:50:31.164  18757-18757/? E/AndroidRuntime﹕ FATAL EXCEPTION: main
java.lang.ExceptionInInitializerError
        at com.nightscapecreations.anim3free.VideoLiveWallpaper.onSharedPreferenceChanged(VideoLiveWallpaper.java:165)
        at com.nightscapecreations.anim3free.VideoLiveWallpaper.onCreate(VideoLiveWallpaper.java:81)
        at android.app.ActivityThread.handleCreateService(ActivityThread.java:2273)
        at android.app.ActivityThread.access$1600(ActivityThread.java:127)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1212)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at android.os.Looper.loop(Looper.java:137)
        at android.app.ActivityThread.main(ActivityThread.java:4441)
        at java.lang.reflect.Method.invokeNative(Native Method)
        at java.lang.reflect.Method.invoke(Method.java:511)
        at com.android.internal.os.ZygoteInit$MethodAndArgsCaller.run(ZygoteInit.java:823)
        at com.android.internal.os.ZygoteInit.main(ZygoteInit.java:590)
        at dalvik.system.NativeStart.main(Native Method)
 Caused by: java.lang.UnsatisfiedLinkError: Cannot load library: link_image[1936]:   144 could not load needed library '/data/data/com.nightscapecreations.anim1free/lib/libavutil.so' for 'libavcore.so' (load_library[1091]: Library '/data/data/com.nightscapecreations.anim1free/lib/libavutil.so' not found)
        at java.lang.Runtime.loadLibrary(Runtime.java:370)
        at java.lang.System.loadLibrary(System.java:535)
        at com.nightscapecreations.anim3free.NativeCalls.<clinit>(NativeCalls.java:64)
        at com.nightscapecreations.anim3free.VideoLiveWallpaper.onSharedPreferenceChanged(VideoLiveWallpaper.java:165)
        at com.nightscapecreations.anim3free.VideoLiveWallpaper.onCreate(VideoLiveWallpaper.java:81)
        at android.app.ActivityThread.handleCreateService(ActivityThread.java:2273)
        at android.app.ActivityThread.access$1600(ActivityThread.java:127)
        at android.app.ActivityThread$H.handleMessage(ActivityThread.java:1212)
        at android.os.Handler.dispatchMessage(Handler.java:99)
        at android.os.Looper.loop(Looper.java:137)
        at android.app.ActivityThread.main(ActivityThread.java:4441)
        at java.lang.reflect.Method.invokeNative(Native Method)
        at java.lang.reflect.Method.invoke(Method.java:511)

I'm a novice Android/Java/C++ developer, and I'm not sure what this error means, but Google has me believing that my new libraries are not being found. In my Eclipse project I had this set of libraries in "libs\armeabi", and another copy nested in a more complicated folder structure at "jni\ffmpeg-android\build\ffmpeg\armeabi\lib". Apart from renaming "libs" to "jniLibs", Android Studio seems to have kept everything the same, but I've hit a brick wall with this error and I'm not sure how to proceed.
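One detail that stands out in the log: the linker is searching for libavutil.so under my old package name (com.nightscapecreations.anim1free) even though this app is com.nightscapecreations.anim3free. To see exactly which library fails first, a load probe along these lines can log each step instead of crashing in a static initializer (a rough debugging sketch only; the LibraryProbe class and its log tag are made up):

    import android.util.Log;

    // Rough diagnostic sketch: attempt each library in dependency order
    // (libavutil first, since libavcore.so lists it as a needed library)
    // and log the first failure instead of throwing from <clinit>.
    public class LibraryProbe {
        public static void probeAll() {
            String[] libs = { "avutil", "avcore", "avcodec",
                              "avformat", "avfilter", "swscale", "video" };
            for (String name : libs) {
                try {
                    System.loadLibrary(name);
                    Log.d("LibraryProbe", "loaded lib" + name + ".so");
                } catch (UnsatisfiedLinkError e) {
                    Log.e("LibraryProbe", "failed on lib" + name + ".so", e);
                    break; // later libs would likely fail for the same reason
                }
            }
        }
    }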

How can I compile this new app, under its new name, with Android Studio?

In case it helps, here is my Android.mk file:

    LOCAL_PATH := $(call my-dir)

    include $(CLEAR_VARS)
    MY_LIB_PATH := ffmpeg-android/build/ffmpeg/armeabi/lib
    LOCAL_MODULE := bambuser-libavcore
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcore.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavformat
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavformat.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavcodec
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavcodec.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavfilter
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavfilter.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libavutil
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libavutil.so
    include $(PREBUILT_SHARED_LIBRARY)

    include $(CLEAR_VARS)
    LOCAL_MODULE := bambuser-libswscale
    LOCAL_SRC_FILES := $(MY_LIB_PATH)/libswscale.so
    include $(PREBUILT_SHARED_LIBRARY)

    #local_PATH := $(call my-dir)

    include $(CLEAR_VARS)

    LOCAL_CFLAGS := -DANDROID_NDK \
                    -DDISABLE_IMPORTGL

    LOCAL_MODULE    := video
    LOCAL_SRC_FILES := video.c

    LOCAL_C_INCLUDES := \
        $(LOCAL_PATH)/include \
        $(LOCAL_PATH)/ffmpeg-android/ffmpeg \
        $(LOCAL_PATH)/freetype/include/freetype2 \
        $(LOCAL_PATH)/freetype/include \
        $(LOCAL_PATH)/ftgl/src \
        $(LOCAL_PATH)/ftgl
    LOCAL_LDLIBS := -L$(NDK_PLATFORMS_ROOT)/$(TARGET_PLATFORM)/arch-arm/usr/lib -L$(LOCAL_PATH) -L$(LOCAL_PATH)/ffmpeg-android/build/ffmpeg/armeabi/lib/ -lGLESv1_CM -ldl -lavformat -lavcodec -lavfilter -lavutil -lswscale -llog -lz -lm

    include $(BUILD_SHARED_LIBRARY)

And here is my NativeCalls.java:

    package com.nightscapecreations.anim3free;

    public class NativeCalls {
        //ffmpeg
        public static native void initVideo();
        public static native void loadVideo(String fileName); //
        public static native void prepareStorageFrame();
        public static native void getFrame(); //
        public static native void freeConversionStorage();
        public static native void closeVideo();//
        public static native void freeVideo();//
        //opengl
        public static native void initPreOpenGL(); //
        public static native void initOpenGL(); //
        public static native void drawFrame(); //
        public static native void closeOpenGL(); //
        public static native void closePostOpenGL();//
        //wallpaper
        public static native void updateVideoPosition();
        public static native void setSpanVideo(boolean b);
        //getters
        public static native int getVideoHeight();
        public static native int getVideoWidth();
        //setters
        public static native void setWallVideoDimensions(int w,int h);
        public static native void setWallDimensions(int w,int h);
        public static native void setScreenPadding(int w,int h);
        public static native void setVideoMargins(int w,int h);
        public static native void setDrawDimensions(int drawWidth,int drawHeight);
        public static native void setOffsets(int x,int y);
        public static native void setSteps(int xs,int ys);
        public static native void setScreenDimensions(int w, int h);
        public static native void setTextureDimensions(int tx, int ty);
        public static native void setOrientation(boolean b);
        public static native void setPreviewMode(boolean b);
        public static native void setTonality(int t);
        public static native void toggleGetFrame(boolean b);
        //fps
        public static native void setLoopVideo(boolean b);

        static {
            System.loadLibrary("avcore");
            System.loadLibrary("avformat");
            System.loadLibrary("avcodec");
            //System.loadLibrary("avdevice");
            System.loadLibrary("avfilter");
            System.loadLibrary("avutil");
            System.loadLibrary("swscale");
            System.loadLibrary("video");
        }

    }
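One thing I now notice is the load order: the error says libavcore.so could not find libavutil.so, yet this static block loads avcore before avutil. From what I've read, older Android linkers don't search the app's lib directory for a library's dependencies on their own, so each library may need to be loaded before the ones that depend on it. A possible reordering (just a sketch; I haven't verified the full dependency chain of these particular builds):

    static {
        // Load dependencies before dependents: avutil is needed by the
        // other FFmpeg libraries, and video needs all of them.
        System.loadLibrary("avutil");
        System.loadLibrary("avcore");
        System.loadLibrary("avcodec");
        System.loadLibrary("avformat");
        System.loadLibrary("avfilter");
        System.loadLibrary("swscale");
        System.loadLibrary("video");
    }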

Edit

Here is the first part of my video.c file:

    #include <GLES/gl.h>
    #include <GLES/glext.h>

    #include <GLES2/gl2.h>
    #include <GLES2/gl2ext.h>

    #include <stdlib.h>
    #include <time.h>

    #include <libavcodec/avcodec.h>
    #include <libavformat/avformat.h>
    #include <libswscale/swscale.h>

    #include <jni.h>  
    #include <string.h>  
    #include <stdio.h>
    #include <android/log.h>

    //#include <FTGL/ftgl.h>

    //ffmpeg video variables
    int      initializedVideo=0;
    int      initializedFrame=0;
    AVFormatContext *pFormatCtx=NULL;
    int             videoStream;
    AVCodecContext  *pCodecCtx=NULL;
    AVCodec         *pCodec=NULL;
    AVFrame         *pFrame=NULL;
    AVPacket        packet;
    int             frameFinished;
    float           aspect_ratio;

    //ffmpeg video conversion variables
    AVFrame         *pFrameConverted=NULL;
    int             numBytes;
    uint8_t         *bufferConverted=NULL;

    //opengl
    int textureFormat=PIX_FMT_RGBA; // PIX_FMT_RGBA   PIX_FMT_RGB24
    int GL_colorFormat=GL_RGBA; // Must match the colorspace specified for textureFormat
    int textureWidth=256;
    int textureHeight=256;
    int nTextureHeight=-256;
    int textureL=0, textureR=0, textureW=0;
    int frameTonality;

    //GLuint textureConverted=0;
    GLuint texturesConverted[2] = { 0,1 };
    GLuint dummyTex = 2;
    static int len=0;


    static const char* BWVertexSrc =
             "attribute vec4 InVertex;\n"
             "attribute vec2 InTexCoord0;\n"
             "attribute vec2 InTexCoord1;\n"
             "uniform mat4 ProjectionModelviewMatrix;\n"
             "varying vec2 TexCoord0;\n"
             "varying vec2 TexCoord1;\n"

             "void main()\n"
             "{\n"
             "  gl_Position = ProjectionModelviewMatrix * InVertex;\n"
             "  TexCoord0 = InTexCoord0;\n"
             "  TexCoord1 = InTexCoord1;\n"
             "}\n";
    static const char* BWFragmentSrc  =

             "#version 110\n"
             "uniform sampler2D Texture0;\n"
             "uniform sampler2D Texture1;\n"

             "varying vec2 TexCoord0;\n"
             "varying vec2 TexCoord1;\n"

             "void main()\n"
             "{\n"
            "   vec3 color = texture2D(m_Texture, texCoord).rgb;\n"
            "   float gray = (color.r + color.g + color.b) / 3.0;\n"
            "   vec3 grayscale = vec3(gray);\n"

            "   gl_FragColor = vec4(grayscale, 1.0);\n"
             "}";
    static GLuint shaderProgram;


    //// Create a pixmap font from a TrueType file.
    //FTGLPixmapFont font("/home/user/Arial.ttf");
    //// Set the font size and render a small text.
    //font.FaceSize(72);
    //font.Render("Hello World!");

    //screen dimensions
    int screenWidth = 50;
    int screenHeight= 50;
    int screenL=0, screenR=0, screenW=0;
    int dPaddingX=0,dPaddingY=0;
    int drawWidth=50,drawHeight=50;

    //wallpaper
    int wallWidth = 50;
    int wallHeight = 50;
    int xOffSet, yOffSet;
    int xStep, yStep;
    jboolean spanVideo = JNI_TRUE;

    //video dimensions
    int wallVideoWidth = 0;
    int wallVideoHeight = 0;
    int marginX, marginY;
    jboolean isScreenPortrait = JNI_TRUE;
    jboolean isPreview = JNI_TRUE;
    jboolean loopVideo = JNI_TRUE;
    jboolean isGetFrame = JNI_TRUE;

    //file
    const char * szFileName;

    #define max( a, b ) ( ((a) > (b)) ? (a) : (b) )
    #define min( a, b ) ( ((a) < (b)) ? (a) : (b) )

    //test variables
    #define RGBA8(r, g, b)  (((r) << (24)) | ((g) << (16)) | ((b) << (8)) | 255)
    int sPixelsInited=JNI_FALSE;
    uint32_t *s_pixels=NULL;

    int s_pixels_size() { 
      return (sizeof(uint32_t) * textureWidth * textureHeight * 5); 
    }

    void render_pixels1(uint32_t *pixels, uint32_t c) {
        int x, y;
        /* fill the entire texture with solid yellow */
        for (y = 0; y < textureHeight; y++) {
            for (x = 0; x < textureWidth; x++) {
                int idx = x + y * textureWidth;
                pixels[idx] = RGBA8(255, 255, 0);
            }
        }
    }

    void render_pixels2(uint32_t *pixels, uint32_t c) {
        int x, y;
        /* fill the entire texture with solid blue */
        for (y = 0; y < textureHeight; y++) {
            for (x = 0; x < textureWidth; x++) {
                int idx = x + y * textureWidth;
                pixels[idx] = RGBA8(0, 0, 255);
            }
        }
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_initVideo (JNIEnv * env, jobject this) {
        initializedVideo = 0;
        initializedFrame = 0;
    }

    /* list of things that get loaded: */
    /* buffer */
    /* pFrameConverted */
    /* pFrame */
    /* pCodecCtx */
    /* pFormatCtx */
    void Java_com_nightscapecreations_anim3free_NativeCalls_loadVideo (JNIEnv * env, jobject this, jstring fileName)  {
        jboolean isCopy;
        szFileName = (*env)->GetStringUTFChars(env, fileName, &isCopy);
        //debug
        __android_log_print(ANDROID_LOG_DEBUG, "NDK: ", "NDK:LC: [%s]", szFileName);
        // Register all formats and codecs
        av_register_all();
        // Open video file
        if(av_open_input_file(&pFormatCtx, szFileName, NULL, 0, NULL)!=0) {
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Couldn't open file");
        return;
        }
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Succesfully loaded file");
        // Retrieve stream information */
        if(av_find_stream_info(pFormatCtx)<0) {
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Couldn't find stream information");
        return;
        }
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Found stream info");
        // Find the first video stream
        videoStream=-1;
        int i;
        for(i=0; i<pFormatCtx->nb_streams; i++)
            if(pFormatCtx->streams[i]->codec->codec_type==CODEC_TYPE_VIDEO) {
                videoStream=i;
                break;
            }
        if(videoStream==-1) {
            __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Didn't find a video stream");
            return;
        }
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Found video stream");
        // Get a pointer to the codec context for the video stream
        pCodecCtx=pFormatCtx->streams[videoStream]->codec;
        // Find the decoder for the video stream
        pCodec=avcodec_find_decoder(pCodecCtx->codec_id);
        if(pCodec==NULL) {
            __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Unsupported codec");
            return;
        }
        // Open codec
        if(avcodec_open(pCodecCtx, pCodec)<0) {
            __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Could not open codec");
            return;
        }
        // Allocate video frame (decoded pre-conversion frame)
        pFrame=avcodec_alloc_frame();
        // keep track of initialization
        initializedVideo = 1;
        __android_log_print(ANDROID_LOG_DEBUG, "video.c", "NDK: Finished loading video");
    }

    //for this to work, you need to set the scaled video dimensions first
    void Java_com_nightscapecreations_anim3free_NativeCalls_prepareStorageFrame (JNIEnv * env, jobject this)  {
        // Allocate an AVFrame structure
        pFrameConverted=avcodec_alloc_frame();
        // Determine required buffer size and allocate buffer
        numBytes=avpicture_get_size(textureFormat, textureWidth, textureHeight);
        bufferConverted=(uint8_t *)av_malloc(numBytes*sizeof(uint8_t));
        if ( pFrameConverted == NULL || bufferConverted == NULL )
            __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "Out of memory");
        // Assign appropriate parts of buffer to image planes in pFrameConverted
        // (an AVFrame is a superset of AVPicture, so the cast below is valid)
        avpicture_fill((AVPicture *)pFrameConverted, bufferConverted, textureFormat, textureWidth, textureHeight);
        __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "Created frame");
        __android_log_print(ANDROID_LOG_DEBUG, "prepareStorage>>>>", "texture dimensions: %dx%d", textureWidth, textureHeight);
        initializedFrame = 1;
    }

    jint Java_com_nightscapecreations_anim3free_NativeCalls_getVideoWidth (JNIEnv * env, jobject this)  {
        return pCodecCtx->width;
    }

    jint Java_com_nightscapecreations_anim3free_NativeCalls_getVideoHeight (JNIEnv * env, jobject this)  {
        return pCodecCtx->height;
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_getFrame (JNIEnv * env, jobject this)  {
        // keep reading packets until we hit the end or find a video packet
        while(av_read_frame(pFormatCtx, &packet)>=0) {
            static struct SwsContext *img_convert_ctx;
            // Is this a packet from the video stream?
            if(packet.stream_index==videoStream) {
                // Decode video frame
                /* __android_log_print(ANDROID_LOG_DEBUG,  */
                /*            "video.c",  */
                /*            "getFrame: Try to decode frame" */
                /*            ); */
                avcodec_decode_video(pCodecCtx, pFrame, &frameFinished, packet.data, packet.size);
                // Did we get a video frame?
                if(frameFinished) {
                    if(img_convert_ctx == NULL) {
                        /* get/set the scaling context */
                        int w = pCodecCtx->width;
                        int h = pCodecCtx->height;
                        img_convert_ctx = sws_getContext(w, h, pCodecCtx->pix_fmt, textureWidth,textureHeight, textureFormat, SWS_FAST_BILINEAR, NULL, NULL, NULL);
                        if(img_convert_ctx == NULL) {
                            return;
                        }
                    }
                    /* if img convert null */
                    /* finally scale the image */
                    /* __android_log_print(ANDROID_LOG_DEBUG,  */
                    /*          "video.c",  */
                    /*          "getFrame: Try to scale the image" */
                    /*          ); */

                    //pFrameConverted = pFrame;
                    sws_scale(img_convert_ctx, pFrame->data, pFrame->linesize, 0, pCodecCtx->height, pFrameConverted->data, pFrameConverted->linesize);
                    //av_picture_crop(pFrameConverted->data, pFrame->data, 1, pCodecCtx->height, pCodecCtx->width);
                    //av_picture_crop();
                    //avfilter_vf_crop();

                    /* do something with pFrameConverted */
                    /* ... see drawFrame() */
                    /* We found a video frame, did something with it, now free up
                       packet and return */
                    av_free_packet(&packet);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.age: %d", pFrame->age);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.buffer_hints: %d", pFrame->buffer_hints);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.display_picture_number: %d", pFrame->display_picture_number);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.hwaccel_picture_private: %d", pFrame->hwaccel_picture_private);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.key_frame: %d", pFrame->key_frame);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.palette_has_changed: %d", pFrame->palette_has_changed);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.pict_type: %d", pFrame->pict_type);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrame.qscale_type: %d", pFrame->qscale_type);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.age: %d", pFrameConverted->age);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.buffer_hints: %d", pFrameConverted->buffer_hints);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.display_picture_number: %d", pFrameConverted->display_picture_number);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.hwaccel_picture_private: %d", pFrameConverted->hwaccel_picture_private);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.key_frame: %d", pFrameConverted->key_frame);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.palette_has_changed: %d", pFrameConverted->palette_has_changed);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.pict_type: %d", pFrameConverted->pict_type);
    //              __android_log_print(ANDROID_LOG_INFO, "Droid Debug", "pFrameConverted.qscale_type: %d", pFrameConverted->qscale_type);
                    return;
                } /* if frame finished */
            } /* if packet video stream */
            // Free the packet that was allocated by av_read_frame
            av_free_packet(&packet);
        } /* while */
        //reload video when you get to the end
        av_seek_frame(pFormatCtx,videoStream,0,AVSEEK_FLAG_ANY);
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_setLoopVideo (JNIEnv * env, jobject this, jboolean b) {
        loopVideo = b;
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_closeVideo (JNIEnv * env, jobject this) {
        if ( initializedFrame == 1 ) {
            // Free the converted image
            av_free(bufferConverted);
            av_free(pFrameConverted);
            initializedFrame = 0;
            __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed converted image");
        }
        if ( initializedVideo == 1 ) {
            /* // Free the YUV frame */
            av_free(pFrame);
            /* // Close the codec */
            avcodec_close(pCodecCtx);
            // Close the video file
            av_close_input_file(pFormatCtx);
            initializedVideo = 0;
            __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed video structures");
        }
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_freeVideo (JNIEnv * env, jobject this) {
        if ( initializedVideo == 1 ) {
            /* // Free the YUV frame */
            av_free(pFrame);
            /* // Close the codec */
            avcodec_close(pCodecCtx);
            // Close the video file
            av_close_input_file(pFormatCtx);
            __android_log_print(ANDROID_LOG_DEBUG, "closeVideo>>>>", "Freed video structures");
            initializedVideo = 0;
        }
    }

    void Java_com_nightscapecreations_anim3free_NativeCalls_freeConversionStorage (JNIEnv * env, jobject this) {
        if ( initializedFrame == 1 ) {
            // Free the converted image
            av_free(bufferConverted);
            av_freep(&pFrameConverted); // av_freep expects the address of the pointer
            initializedFrame = 0;
        }
    }

    /*--- END OF VIDEO ----*/

    /* disable these capabilities. */
    static GLuint s_disable_options[] = {
        GL_FOG,
        GL_LIGHTING,
        GL_CULL_FACE,
        GL_ALPHA_TEST,
        GL_BLEND,
        GL_COLOR_LOGIC_OP,
        GL_DITHER,
        GL_STENCIL_TEST,
        GL_DEPTH_TEST,
        GL_COLOR_MATERIAL,
        0
    };

    // For stuff that opengl needs to work with,
    // like the bitmap containing the texture
    void Java_com_nightscapecreations_anim3free_NativeCalls_initPreOpenGL (JNIEnv * env, jobject this)  {

    }
    ...

1 Answer:

Answer 0 (score: 4)

If you just want to reuse your previous libs without compiling anything with the NDK, you can simply put all your .so files under jniLibs/<abi>.
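For example, with an armeabi-only build the layout would look something like this (paths relative to the app module; the library set mirrors what NativeCalls loads):

    app/src/main/jniLibs/
        armeabi/
            libavcodec.so
            libavcore.so
            libavfilter.so
            libavformat.so
            libavutil.so
            libswscale.so
            libvideo.so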

Otherwise, since your ndk-build setup depends on prebuilts, you can't configure it properly through Gradle's direct NDK integration (ndk{}). In any case, since that NDK support is currently deprecated, the cleanest way is to have Gradle call ndk-build and reuse your existing Makefiles:

    import org.apache.tools.ant.taskdefs.condition.Os

    ...

    android {
        ...
        sourceSets.main {
            jniLibs.srcDir 'src/main/libs' // set .so files location to libs instead of jniLibs
            jni.srcDirs = [] // disable automatic ndk-build call
        }

        // add a task that calls the regular ndk-build(.cmd) script from the app directory
        task ndkBuild(type: Exec) {
            if (Os.isFamily(Os.FAMILY_WINDOWS)) {
                commandLine 'ndk-build.cmd', '-C', file('src/main').absolutePath
            } else {
                commandLine 'ndk-build', '-C', file('src/main').absolutePath
            }
        }

        // add this task as a dependency of Java compilation
        tasks.withType(JavaCompile) {
            compileTask -> compileTask.dependsOn ndkBuild
        }
    }
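With this setup, ndk-build compiles your existing Android.mk and drops its output into src/main/libs/<abi>, and the jniLibs.srcDir override tells the Android Gradle plugin to package those .so files into the APK; the System.loadLibrary() calls on the Java side stay the same.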