Camera function does not display the image

Time: 2011-12-26 11:42:11

Tags: javascript android html5 camera

I have a hybrid app that opens Android's native camera. My code:

public class CameraFunActivity extends Activity {
    OpenCamera openCamera;

    /** Called when the activity is first created. */
    @Override
    public void onCreate(Bundle savedInstanceState) {
        super.onCreate(savedInstanceState);
        setContentView(R.layout.main);
        WebView webView = (WebView) findViewById(R.id.webkitWebView1);
        WebSettings settings = webView.getSettings();
        settings.setJavaScriptEnabled(true);
        settings.setDatabaseEnabled(true);
        openCamera = new OpenCamera(webView, CameraFunActivity.this);
        // Expose OpenCamera to the page's JavaScript under the name "camera"
        webView.addJavascriptInterface(openCamera, "camera");
    }

    @Override
    protected void onActivityResult(int requestCode, int resultCode, Intent data) {
        if (requestCode == 0 && resultCode == Activity.RESULT_OK) {
            System.out.println("!!!!!!!!!!!!!!!!!!!!Camera Working...........");
            String imagePath = "file:/" + openCamera.getPath();
            System.out.println("Image Pathhhhhhhhhh :::::::::::: " + imagePath);
            openCamera.setPath(imagePath);
        }
        // ...
    }
}
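A side note not from the original post: on Android 4.2 (API 17) and later, when the app targets API 17 or higher, addJavascriptInterface only exposes methods annotated with @JavascriptInterface to the page. A minimal sketch of the bridge class under that assumption (the _path field here is just a placeholder):

import android.webkit.JavascriptInterface;

public class OpenCamera {
    private String _path;

    // Only annotated methods are callable from JavaScript when targetSdkVersion >= 17.
    @JavascriptInterface
    public String getPath() {
        return _path;
    }
}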

HTML:

function captureImage1(){
    camera.startCamera();
    var path = "file://" + camera.getPath();
    //alert(path);
    document.getElementById("image1").src = path;
}

               

The OpenCamera class (excerpt):

public OpenCamera(WebView appView, Activity context) {
    this.mAppView = appView;
    this.context = context;
}

public void setPath(String path){

    _path=path;
}
public String getPath(){

    return _path;
}
public void setBitmap(Bitmap bitmap){
    System.out.println("setting bitmap");
    this.bitmap=bitmap;
}
public Bitmap getBitmap(){
    System.out.println("getting bitmap");
    return bitmap;
}

public void startCamera(){
    /*
    Camera camera = Camera.open();
    Camera.Parameters parameters = camera.getParameters();
    parameters.setPictureFormat(PixelFormat.JPEG);
    camera.setParameters(parameters);*/

    Date dt = new Date();   
    int date=dt.getDate();
    int hours = dt.getHours();   
    int minutes = dt.getMinutes(); 
    int seconds = dt.getSeconds();   
    String curTime = date+"_"+hours + "_"+minutes + "_"+ seconds;
    _path=Environment.getExternalStorageDirectory() +"/"+curTime+".jpg";
    File file = new File( _path );
    Uri outputFileUri = Uri.fromFile( file );
    Intent intent = new Intent(android.provider.MediaStore.ACTION_IMAGE_CAPTURE );
    intent.putExtra( MediaStore.EXTRA_OUTPUT, outputFileUri );
    // System.out.println("Paramssssssssssssss  " + camera.getParameters().toString());

    // Hand off to the native camera app; the result arrives in onActivityResult()
    context.startActivityForResult(intent, 0);
}
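The Date getters used in startCamera() (getDate(), getHours(), getMinutes(), getSeconds()) are deprecated. A minimal sketch, assuming the same external-storage location and filename pattern, of building the timestamp with SimpleDateFormat instead:

import java.io.File;
import java.text.SimpleDateFormat;
import java.util.Date;
import android.os.Environment;

// Formats the current time as "day_hour_minute_second", e.g. "26_11_42_11.jpg"
String curTime = new SimpleDateFormat("dd_HH_mm_ss").format(new Date());
String path = Environment.getExternalStorageDirectory() + "/" + curTime + ".jpg";
File file = new File(path);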

Sometimes it shows the image in the img tag's src and sometimes it does not. I need help figuring out what is going wrong.

Also, could this be related to orientation changes?

Thanks

1 Answer:

Answer 0 (score: 0):

I solved the problem this way:

Add:

    function openCamera1(){
        result = camera.startCameraActivity1();
        PATH1 = camera.getImagePath1();
        path = "file://" + PATH1;
        do {
            fileIndicator = camera.findEOF();
        } while (!fileIndicator);
        document.getElementById("image1").src = path;
    }

Add a method to OpenCamera:

public boolean findEOF() {
    File file = new File(imagePath1);
    System.out.println("Inisde EOFL::::::::::::::" + file.length());

    if (file.length() > 0) {
        System.out.println("Inisde length is::::::::::::::" + file.length());
        return true;
    }
    return false;
}

So the problem was that imagePath() was being called before the image had been written to the file.
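An alternative to polling with findEOF(), sketched here as an assumption rather than something taken from the original answer: notify the page from onActivityResult() once the camera activity returns, so JavaScript never reads the path before the file exists. The onImageReady() callback name and the webView field are hypothetical:

@Override
protected void onActivityResult(int requestCode, int resultCode, Intent data) {
    super.onActivityResult(requestCode, resultCode, data);
    if (requestCode == 0 && resultCode == Activity.RESULT_OK) {
        // By the time the camera activity returns RESULT_OK, the JPEG has been
        // written to the EXTRA_OUTPUT file, so the path can be handed to the page.
        String imagePath = "file://" + openCamera.getPath();
        webView.loadUrl("javascript:onImageReady('" + imagePath + "')");
    }
}

On the page, a hypothetical onImageReady(path) function would simply set document.getElementById("image1").src = path, replacing the busy-wait do/while loop.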