I am trying to take a photo with imagePickerController and display the image in a UIImageView in another view controller.
- (void)imagePickerController:(UIImagePickerController *)picker didFinishPickingMediaWithInfo:(NSDictionary *)info {
    [picker dismissViewControllerAnimated:YES completion:NULL];

    PAWWallPostCreateViewController *wallPostCreateViewController = [[PAWWallPostCreateViewController alloc] initWithNibName:@"PAWWallPostCreateViewController" bundle:nil];
    wallPostCreateViewController.myImage = info[UIImagePickerControllerEditedImage];
    wallPostCreateViewController.imageView = [[UIImageView alloc] initWithImage:wallPostCreateViewController.myImage];
    wallPostCreateViewController.imageView.contentMode = UIViewContentModeScaleAspectFit;
    NSLog(@"%@", wallPostCreateViewController.imageView); // Output below

    [self.navigationController presentViewController:wallPostCreateViewController animated:YES completion:nil];
}
The NSLog output for imageView is:
<UIImageView: 0x18f695b0; frame = (0 0; 640 640); opaque = NO; userInteractionEnabled = NO; layer = <CALayer: 0x18f14240>>
That frame is far too large for my actual UIImageView, which is set to (60 227; 200 200).
I think this is why I can't see the UIImageView at all when I run the app. How can I fix this?
EDIT:
In PAWWallPostCreateViewController.h:
@property (nonatomic, strong) IBOutlet UIImageView *imageView;
and I have linked imageView to my .xib file.
For anyone familiar with autoresize = RM+BM: normally, when I use imagePickerController to put a UIImage into a UIImageView in the same view controller, my UIImageView has an autoresize = RM+BM property. So the log output I want usually looks like:
<UIImageView: 0x1a2b4b30; frame = (60 227; 200 200); opaque = NO; autoresize = RM+BM; userInteractionEnabled = NO; layer = <CALayer: 0x1a2b4bb0>>
But what I get now in this case is the output shown above.
Answer 0 (score: 5)
Simply call this line in the second view controller's viewDidLoad:

    self.imageView.contentMode = UIViewContentModeScaleAspectFit;
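A minimal sketch of what that could look like in PAWWallPostCreateViewController.m, assuming the myImage and imageView properties from the question. The key point is that the nib has loaded by the time viewDidLoad runs, so self.imageView is the outlet with its Interface Builder frame (60 227; 200 200); the image assignment line is an addition here, not part of the original answer:

    // PAWWallPostCreateViewController.m (sketch)
    #import "PAWWallPostCreateViewController.h"

    @implementation PAWWallPostCreateViewController

    - (void)viewDidLoad {
        [super viewDidLoad];
        // The outlet is now loaded from the .xib with its IB frame,
        // so configure it here rather than from the presenting controller.
        self.imageView.contentMode = UIViewContentModeScaleAspectFit;
        self.imageView.image = self.myImage; // image captured by the picker
    }

    @end

Note that assigning a brand-new UIImageView to the imageView property from the presenting controller, as in the question's code, replaces the outlet with a view that is never added to the view hierarchy; once the nib loads, configuring the existing outlet is what makes the image visible.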