I am following the Scala Slick beginner guide and trying to create a simple schema, but I can't seem to find where the 'column' type is imported from at the beginning of the docs.
import slick.driver.H2Driver.api._
import scala.concurrent.ExecutionContext.Implicits.global
/**
* Created by chris on 9/7/16.
*/
class BlockHeaderTable(tag: Tag) extends Table[BlockHeader](tag,"block_headers") {
def version: column[UInt32]
def previousBlockHash: column[DoubleSha256Digest]
def merkleRootHash: column[DoubleSha256Digest]
def time: column[UInt32]
def nBits: column[UInt32]
def nonce: column[UInt32]
}
Here is the error I am getting:
chris@chris-870Z5E-880Z5E-680Z5E:~/dev/bitcoins-spv-node$ sbt compile
[info] Loading project definition from /home/chris/dev/bitcoins-spv-node/project
[info] Set current project to bitcoins-spv-node (in build file:/home/chris/dev/bitcoins-spv-node/)
[info] Compiling 1 Scala source to /home/chris/dev/bitcoins-spv-node/target/scala-2.11/classes...
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:14: not found: type column
[error]   def version: column[UInt32]
[error]                ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:16: not found: type column
[error]   def previousBlockHash: column[DoubleSha256Digest]
[error]                          ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:18: not found: type column
[error]   def merkleRootHash: column[DoubleSha256Digest]
[error]                       ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:20: not found: type column
[error]   def time: column[UInt32]
[error]             ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:22: not found: type column
[error]   def nBits: column[UInt32]
[error]              ^
[error] /home/chris/dev/bitcoins-spv-node/src/main/scala/org/bitcoins/spvnode/models/BlockHeaderTable.scala:24: not found: type column
[error]   def nonce: column[UInt32]
[error]              ^
[error] 6 errors found
[error] (compile:compileIncremental) Compilation failed
Answer 0 (score: 2)
The type of a column is not column but Rep. column is actually a method that tells Slick which database column to use:
class BlockHeaderTable(tag: Tag) extends Table[BlockHeader](tag,"block_headers") {
def version: Rep[UInt32] = column[UInt32]("version")
def previousBlockHash: Rep[DoubleSha256Digest] = column[DoubleSha256Digest]("previous_block_hash")
...
}
Also, I am not sure which types you are using, but Slick does not support them out of the box (see here). You will need to write custom type mappers. For example, a mapper for UInt32:
implicit val UInt32Mapper = MappedColumnType.base[UInt32, Long](
u => u.toLong, // convert UInt32 to Long here
l => UInt32(l) // and Long to UInt32 here
)
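With a mapper like that in scope (plus an analogous one for DoubleSha256Digest, which is not shown here), the table definition from the question could look roughly like the sketch below. The column names and the * projection are assumptions; in particular the projection only compiles if BlockHeader is a case class with these six fields in this order:

// Sketch only: assumes a DoubleSha256Digest mapper analogous to UInt32Mapper is also in scope,
// and that BlockHeader is a case class with exactly these six fields in this order.
class BlockHeaderTable(tag: Tag) extends Table[BlockHeader](tag, "block_headers") {
  def version: Rep[UInt32] = column[UInt32]("version")
  def previousBlockHash: Rep[DoubleSha256Digest] = column[DoubleSha256Digest]("previous_block_hash")
  def merkleRootHash: Rep[DoubleSha256Digest] = column[DoubleSha256Digest]("merkle_root_hash")
  def time: Rep[UInt32] = column[UInt32]("time")
  def nBits: Rep[UInt32] = column[UInt32]("n_bits")
  def nonce: Rep[UInt32] = column[UInt32]("nonce")

  def * = (version, previousBlockHash, merkleRootHash, time, nBits, nonce) <>
    (BlockHeader.tupled, BlockHeader.unapply)
}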
Answer 1 (score: 0)
Slick does not understand custom types beyond the standard JDBC types such as Timestamp, Long, String, Char, Boolean, etc. In order to use a custom type, you have to provide a Slick mapping from the custom type to a JDBC type.
You need to do this for both UInt32 and DoubleSha256Digest. For example, DateTime is a custom type that Slick cannot understand, but Slick does understand java.sql.Timestamp, so we provide a Slick mapping that tells Slick how to deal with DateTime:
import java.sql.Timestamp
import org.joda.time.DateTime

implicit def jodaTimeMapping: BaseColumnType[DateTime] = MappedColumnType.base[DateTime, Timestamp](
  dateTime => new Timestamp(dateTime.getMillis), // DateTime -> Timestamp
  timeStamp => new DateTime(timeStamp.getTime))  // Timestamp -> DateTime
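Once that mapping is in scope, a DateTime column can be declared directly in a table definition. A minimal sketch (the column name created_at is made up for illustration):

def createdAt: Rep[DateTime] = column[DateTime]("created_at")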
A complete example:
case class Foo(str: String) // Foo needs a Slick mapping for the code below to compile

implicit def fooMapping: BaseColumnType[Foo] = MappedColumnType.base[Foo, String](
  foo => foo.str,  // Foo -> String
  str => Foo(str)) // String -> Foo

case class Person(name: String, foo: Foo)

class Persons(tag: Tag) extends Table[Person](tag, "persons") {
  def name = column[String]("name")
  def foo = column[Foo]("foo") // Foo can be used directly because the implicit mapping is in scope
  def * = (name, foo) <> (Person.tupled, Person.unapply)
}
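With fooMapping in scope, Foo columns can then be used in queries like any built-in type. A small usage sketch (the query and the filter value are just for illustration):

val persons = TableQuery[Persons]

// The === comparison on a Foo column compiles because the implicit BaseColumnType[Foo] is in scope.
val findBar = persons.filter(_.foo === Foo("bar")).result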