Unity - How do I increase a value while a key is held down?

Asked: 2019-02-22 16:55:03

Tags: c# unity3d

    public int power;

    // Start is called before the first frame update
    void Start()
    {
        player = GameObject.Find("Whyareyoulikethis");
        while (Input.GetKey(KeyCode.Space))
        {
            power = power + 10;
        }

        // Places the ball at the player's current position.
        transform.Translate(-player.transform.forward);
        rb = GetComponent<Rigidbody>();
        rb.AddForce(-player.transform.forward * power);
    }

This is supposed to increase power by 10 while the space bar is held down. Unfortunately it does absolutely nothing: when the ball is spawned, it simply drops without any force being applied. I have also tried GetKeyUp and GetKeyDown instead of GetKey, but they made no difference to the end result. I also tried it in an if statement under void Update(), and the same thing happened. Silly as it was, I even tried it in a while statement under void Update(), which froze the engine as you would expect.

1 Answer:

Answer 0 (score: 2)

A while loop blocks your game until it terminates. The input state is never refreshed inside the loop, so once you enter it the result of Input.GetKey can never change and you never get out again.
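
To make that explicit, here is the loop from the question with added comments (the explanation in the comments, not the code, is mine):

    // Unity only refreshes the input state between frames, and no new frame can
    // start while this loop is still blocking the main thread. So either Space is
    // not held at the moment the loop is reached (the body never runs and power
    // stays 0), or GetKey keeps returning true forever and the game freezes.
    while (Input.GetKey(KeyCode.Space))
    {
        power = power + 10;
    }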

Besides, it makes no sense in Start anyway: Start is called only once, when the GameObject is initialized, and the space bar will not be pressed at that moment.

Move the check to Update instead, which is called every frame.

As Cid correctly commented, this would increase the power very fast and in a frame-rate-dependent way. You probably rather want it to increase at a fixed rate per second, independent of the frame rate, so use Time.deltaTime.
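
A minimal sketch of the difference, assuming both lines run inside Update:

    // Frame-rate dependent: adds 10 on every frame,
    // i.e. roughly 600 per second at 60 fps.
    power += 10;

    // Frame-rate independent: Time.deltaTime is the duration of the last frame
    // in seconds, so this adds 10 per second at any frame rate.
    power += 10 * Time.deltaTime;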

In this case power should rather be a float:

    public float power;

    private void Start()
    {
        player = GameObject.Find("Whyareyoulikethis");
        rb = GetComponent<Rigidbody>();
    }

    private void Update()
    {
        if (Input.GetKey(KeyCode.Space))
        {
            power += 10 * Time.deltaTime;
        }

        if (Input.GetKeyUp(KeyCode.Space))
        {
            // Places the ball at the player's current position.
            // You don't want to translate it but set it to the player's position here.
            transform.position = player.transform.position;

            // Rather than using AddForce you seem to simply want to set a certain velocity,
            // though I don't understand why you shoot backwards...
            rb.velocity = -player.transform.forward * power;
        }
    }

Where exactly the rest should be executed depends on your setup, but I would guess, for example, on releasing the button.
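
For completeness, a rough sketch of how the whole script could look as a single MonoBehaviour; the class name BallCharger and the reset of power after the shot are my own additions, not part of the original answer:

    using UnityEngine;

    public class BallCharger : MonoBehaviour   // hypothetical class name
    {
        public float power;

        private GameObject player;
        private Rigidbody rb;

        private void Start()
        {
            // Cache the references once instead of looking them up every frame.
            player = GameObject.Find("Whyareyoulikethis");
            rb = GetComponent<Rigidbody>();
        }

        private void Update()
        {
            // Charge up while the space bar is held, independent of the frame rate.
            if (Input.GetKey(KeyCode.Space))
            {
                power += 10f * Time.deltaTime;
            }

            // On release: place the ball at the player and launch it.
            if (Input.GetKeyUp(KeyCode.Space))
            {
                transform.position = player.transform.position;
                rb.velocity = -player.transform.forward * power;

                // Assumption: reset the charge so the next press starts from zero.
                power = 0f;
            }
        }
    }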
