Does anyone know how to use Resources.LoadAll (or a similar method) in an EditorWindow?
I'm getting this error: LoadAll can only be called from the main thread.
I'm building a node editor, and I need all of the items stored in Resources.
This is how I call my static method:
foreach (var item in DataManager.Items.All<Food>())
{
    _foodItems.Add(item.Name);
}
Here is the static method:
internal static IEnumerable<T> All<T>() where T : BaseItem
{
    return Resources.LoadAll<T>(itemsPaths[typeof(T)]);
}
BaseItem is a public abstract class with some public fields:
public abstract class BaseItem : Datablock
{
    #region Data
    /// <summary>
    /// Game item name.
    /// </summary>
    public string Name;
    #endregion
}
Oh, and QuestTaskNode, the class I want to use Resources.LoadAll with, inherits from ScriptableObject.
Answer 0 (score: 1):
Bad (a field initializer is not run on the main thread):

class BadCode {
    Object[] items = Resources.LoadAll(....);
}

Also bad (a constructor call fails the same way):

class BadCode {
    BadCode() {
        Resources.LoadAll(....);
    }
}

Good:

class GoodCode {
    void PutUnityAPICallInsideAFunction() {
        Resources.LoadAll(....);
    }
}

Simply put your Unity API code inside a function. Also, do not put it in the constructor of a class that inherits from MonoBehaviour; you will get the same error, because constructors are not called on the main thread.
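Applied to the question's setup, one place to make the call in an EditorWindow is OnEnable, which Unity invokes on the main thread when the window opens. A minimal sketch, assuming the asker's DataManager.Items.All<T>() helper and Food item type exist; the window class and menu path are placeholder names:

```csharp
using System.Collections.Generic;
using UnityEditor;
using UnityEngine;

public class NodeEditorWindow : EditorWindow
{
    // Item names collected from Resources, for display in the node editor.
    private readonly List<string> _foodItems = new List<string>();

    [MenuItem("Window/Node Editor")] // placeholder menu path
    private static void Open()
    {
        GetWindow<NodeEditorWindow>("Node Editor");
    }

    // OnEnable runs on the main thread, so Resources.LoadAll is safe here,
    // unlike a field initializer or constructor of this class.
    private void OnEnable()
    {
        _foodItems.Clear();
        foreach (var item in DataManager.Items.All<Food>()) // asker's helper
        {
            _foodItems.Add(item.Name);
        }
    }
}
```

The same applies to OnGUI or any other Unity message method; the point is that the load happens inside a method Unity calls, not during object construction.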