I really need help with this. I'm working on a personal project to build a web app that lets users take a selfie with their device's native camera. When the page loads, an existing cropped image with a transparent background is overlaid on top of the user's phone camera view.
The live camera part seems to work fine, except that I can't get the overlay image to sit at the very bottom-right of the screen. <Photo 1 - Portrait Mode> <Photo 2 - Landscape Mode> I've tried the code below, but the image never ends up anchored where I want it:
#camera--filter{
    height: 75%;
    width: 75%;
    position: fixed;
    right: 0;
    bottom: 0;
    object-fit: contain;
}
So in the end I settled on "position: absolute", but as the attached photos show, the image overlay still doesn't reach all the way to the right edge of the screen.
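My current suspicion is that object-fit: contain is the culprit: the 75% × 75% box does sit in the corner, but the artwork gets letterboxed and centered inside it, so the visible image stops short of the right edge. What I'm thinking of trying next (untested) is letting the element shrink to the rendered image instead of forcing a fixed box:

#camera--filter{
    position: fixed;
    right: 0;
    bottom: 0;
    /* let the image keep its own aspect ratio, capped at 75% of the viewport */
    max-height: 75%;
    max-width: 75%;
    height: auto;
    width: auto;
}

If I understand this correctly, right: 0 and bottom: 0 would then pin the visible image itself to the corner rather than an invisible letterboxed box.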
I also can't seem to get the overlay image to show up on the captured-image popup once a photo is taken <Photo 3 - No overlay image when captured>. I'd be grateful to anyone willing to assist. Please see my code below. You can find my prototype on GitHub <here>.
On top of that, I still don't know how to save the captured image together with the overlay to the user's phone gallery, or to let them share it online to their social media accounts.
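For the saving/sharing part, my reading so far suggests exporting the canvas with canvas.toBlob() and then either handing the file to the Web Share API or falling back to a plain download link. This is only a sketch of what I have in mind (the function name and file name are mine, and it assumes the cameraSensor canvas from the script below already contains the finished selfie-plus-overlay frame):

// Sketch: export the composed canvas and offer native share or a download
function saveOrShare() {
    cameraSensor.toBlob(function(blob) {
        var file = new File([blob], "ricci-selfie.png", { type: "image/png" });
        if (navigator.canShare && navigator.canShare({ files: [file] })) {
            // Native share sheet (needs HTTPS and a user gesture)
            navigator.share({ files: [file], title: "My Ricci Selfie" })
                .catch(function(error) { console.error("Sharing failed.", error); });
        } else {
            // Fallback: offer the image as a download the user can save to their gallery
            var link = document.createElement("a");
            link.href = URL.createObjectURL(blob);
            link.download = "ricci-selfie.png";
            document.body.appendChild(link);
            link.click();
            link.remove();
            setTimeout(function() { URL.revokeObjectURL(link.href); }, 1000);
        }
    }, "image/png");
}

As far as I can tell, navigator.share() with files only works over HTTPS, in response to a user gesture, and in browsers that support sharing files, which is why I kept the download fallback.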
I found a similar post from 2 years ago that was answered by user @Kaiido, but I don't really know how to merge it correctly into my code (my rough attempt is at the very bottom, after the script). I'd really appreciate any help and responses. Thanks in advance!
<html lang="en">
<head>
    <meta charset="utf-8">
    <meta http-equiv="x-ua-compatible" content="ie=edge">
    <meta name="viewport" content="width=device-width, initial-scale=1">
    <!-- Name of project -->
    <title>Ricci Selfie App</title>
    <!-- Link to your main style sheet-->
    <link rel="stylesheet" href="style.css">
</head>
<body>
    <!-- Camera -->
    <main id="camera">
        <!-- Camera sensor -->
        <canvas id="camera--sensor"></canvas>
        <!-- Camera view -->
        <video id="camera--view" autoplay playsinline></video>
        <!-- Camera output -->
        <img src="//:0" alt="" id="camera--output">
        <!-- Camera filter -->
        <div class="embed-responsive embed-responsive-16by9">
            <img class="embed-responsive-item" src="Ricci Selfie Trim.png" alt="Ricci" id="camera--filter">
        </div>
        <!-- Camera trigger -->
        <button id="camera--trigger">Take a Selfie</button>
    </main>
    <!-- Reference to your JavaScript file -->
    <script src="app.js"></script>
</body>
</html>
html, body{
    margin: 0;
    padding: 0;
    height: 100%;
    width: 100%;
}
#camera, #camera--view, #camera--sensor, #camera--output{
    position: fixed;
    height: 100%;
    width: 100%;
    object-fit: cover;
}
#camera--view, #camera--sensor, #camera--output{
    transform: scaleX(-1);
    filter: FlipH;
}
#camera--filter{
    height: 75%;
    width: 75%;
    position: absolute;
    right: 0;
    bottom: 0;
    object-fit: contain;
}
#camera--trigger{
    width: 200px;
    background-color: black;
    color: white;
    font-size: 16px;
    border-radius: 30px;
    border: none;
    padding: 15px 20px;
    text-align: center;
    box-shadow: 0 5px 10px 0 rgba(0,0,0,0.2);
    position: fixed;
    bottom: 30px;
    left: calc(50% - 100px);
}
.taken{
    height: 120px!important;
    width: 90px!important;
    transition: all 0.5s ease-in;
    border: solid 3px white;
    box-shadow: 0 5px 10px 0 rgba(0,0,0,0.2);
    top: 20px;
    right: 20px;
    z-index: 2;
}
// Set constraints for the video stream
var constraints = { video: { facingMode: "user" }, audio: false };
// Keep a reference to the camera track so it can be stopped later if needed
var track = null;
// Define constants
const cameraView = document.querySelector("#camera--view"),
    cameraOutput = document.querySelector("#camera--output"),
    cameraSensor = document.querySelector("#camera--sensor"),
    cameraFilter = document.querySelector("#camera--filter"),
    cameraTrigger = document.querySelector("#camera--trigger");
// Access the device camera and stream to cameraView
function cameraStart() {
    navigator.mediaDevices
        .getUserMedia(constraints)
        .then(function(stream) {
            track = stream.getTracks()[0];
            cameraView.srcObject = stream;
        })
        .catch(function(error) {
            console.error("Oops. Something is broken.", error);
        });
}
// Take a picture when cameraTrigger is tapped
cameraTrigger.onclick = function() {
    cameraSensor.width = cameraView.videoWidth;
    cameraSensor.height = cameraView.videoHeight;
    cameraSensor.getContext("2d").drawImage(cameraView, 0, 0);
    cameraOutput.src = cameraSensor.toDataURL("image/webp");
    cameraOutput.classList.add("taken");
};
// Start the video stream when the window loads
window.addEventListener("load", cameraStart, false);
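From what I understand of @Kaiido's canvas-based answer, the overlay has to be drawn onto the same canvas as the video frame before the data URL is read out, so that the photo and the filter become a single picture. Here is my rough, untested adaptation of the trigger handler above; the 75%-of-frame-height sizing and the mirroring compensation are guesses on my part (the output image is flipped with scaleX(-1) in my CSS, so I try to pre-flip the overlay):

// My rough attempt: composite the overlay onto the captured frame before exporting
cameraTrigger.onclick = function() {
    cameraSensor.width = cameraView.videoWidth;
    cameraSensor.height = cameraView.videoHeight;
    var context = cameraSensor.getContext("2d");
    // Draw the current video frame first
    context.drawImage(cameraView, 0, 0);
    // Guess: scale the overlay to 75% of the frame height, to mirror the CSS sizing
    var scale = (cameraSensor.height * 0.75) / cameraFilter.naturalHeight;
    var overlayWidth = cameraFilter.naturalWidth * scale;
    var overlayHeight = cameraFilter.naturalHeight * scale;
    // #camera--output is displayed with scaleX(-1), so draw the overlay mirrored at the
    // canvas's bottom-left; the CSS flip should then show it un-mirrored at the bottom-right
    context.save();
    context.scale(-1, 1);
    context.drawImage(cameraFilter, -overlayWidth, cameraSensor.height - overlayHeight,
        overlayWidth, overlayHeight);
    context.restore();
    cameraOutput.src = cameraSensor.toDataURL("image/png");
    cameraOutput.classList.add("taken");
};

If that is roughly right, the saveOrShare() sketch above should be able to export the finished picture directly, since the overlay would already be baked into cameraSensor.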