Loading libssl.so.1.0.0: No such file or directory

Date: 2018-08-24 10:01:46

Tags: amazon-web-services go apache-kafka aws-lambda

I am trying to build my first Go Lambda on AWS.

My code is quite simple:

package main

import (
    "bufio"
    "context"
    "fmt"
    "github.com/aws/aws-lambda-go/events"
    "github.com/aws/aws-lambda-go/lambda"
    "github.com/confluentinc/confluent-kafka-go/kafka"
    "github.com/droundy/goopt"
    "os"
    "path/filepath"
)

func badUsage() {
    fmt.Println(goopt.Usage())
    os.Exit(1)
}

func HandleRequest(ctx context.Context, s3Event events.S3Event) {
    for _, record := range s3Event.Records {
        s3 := record.S3
        fmt.Printf("[%s - %s] Bucket = %s, Key = %s \n", record.EventSource, record.EventTime, s3.Bucket.Name, s3.Object.Key)
    }
}

func main() {
    lambda.Start(HandleRequest)
}

Note 1: I also have some functions (not shown here) that use Kafka and produce some messages to it.
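For context, here is a minimal sketch of what such a producer helper might look like with confluent-kafka-go; the broker address, topic and function name are assumptions, not the original code. Any use of this package pulls in the cgo bindings to librdkafka, which is what introduces the native library dependencies discussed in the answers below.

// produceExample is a hypothetical helper; it uses the kafka import from
// the snippet above. Broker address and topic are placeholders.
func produceExample(payload []byte) error {
    p, err := kafka.NewProducer(&kafka.ConfigMap{"bootstrap.servers": "localhost:9092"})
    if err != nil {
        return err
    }
    defer p.Close()

    topic := "goklog"
    if err := p.Produce(&kafka.Message{
        TopicPartition: kafka.TopicPartition{Topic: &topic, Partition: kafka.PartitionAny},
        Value:          payload,
    }, nil); err != nil {
        return err
    }
    p.Flush(15 * 1000) // wait up to 15s for outstanding deliveries
    return nil
}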

I compile it on my own PC with:

GOOS=linux GOARCH=amd64 go build -tags static -o goklog producer.go

But I always end up with this error (in CloudWatch):

error while loading shared libraries: libssl.so.1.0.0: cannot open shared object file: No such file or directory

Edit 2: I suspected static_all might be flaky. I tried it anyway and got:

➜  goklog git:(master) ✗ GOOS=linux GOARCH=amd64 go build -tags static_all
# github.com/confluentinc/confluent-kafka-go/kafka
/usr/lib/librdkafka.a(rddl.o): In function `rd_dl_open0':
/home/maathor/dev/librdkafka/src/rddl.c:80: warning: Using 'dlopen' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
/usr/lib/librdkafka.a(rdaddr.o): In function `rd_getaddrinfo':
/home/maathor/dev/librdkafka/src/rdaddr.c:168: warning: Using 'getaddrinfo' in statically linked applications requires at runtime the shared libraries from the glibc version used for linking
# github.com/confluentinc/confluent-kafka-go/kafka
cannot load imported symbols from ELF file $WORK/github.com/confluentinc/confluent-kafka-go/kafka/_obj/_cgo_.o: no symbol section

3 Answers:

Answer 0 (score: 2):

This is because you are using the "github.com/confluentinc/confluent-kafka-go/kafka" package. It requires the librdkafka C library, which in turn assumes that certain libraries (such as libssl) are available on your machine (see the linked page, under "Requirements").
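As a quick check (a sketch, not part of the original answer), you can list which shared libraries the compiled binary still expects at runtime; on a Lambda-like host the missing OpenSSL library would show up as "not found":

ldd goklog
# example of the kind of line to look for:
#   libssl.so.1.0.0 => not found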

Answer 1 (score: 1):

I suggest building your Lambda package on one of the Lambda-like Docker images, e.g. https://github.com/lambci/docker-lambda

That way, the librdkafka you build against will depend on the same libraries that are available on the Lambda target host.
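A rough sketch of that workflow, assuming the lambci Go build image and that librdkafka (with its headers) has already been installed inside the image, e.g. via a derived Dockerfile; the image tag and mount paths here are assumptions:

# build inside an Amazon-Linux-like container so the linked libraries
# match what the Lambda runtime provides
docker run --rm \
  -v "$PWD":/go/src/github.com/you/goklog \
  -w /go/src/github.com/you/goklog \
  lambci/lambda:build-go1.x \
  go build -tags static -o goklog producer.go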

Answer 2 (score: 0):

The root cause is that AWS Lambda only supports dynamic linking against the glibc runtime, while librdkafka has dynamic dependencies on sasl2, openssl, zlib and zstd. Even if you use go build -tags static -o goklog producer.go to make sure librdkafka itself is statically linked into the binary, those 4 dependencies are still linked dynamically.

Solution:

1. Compile librdkafka with ./configure --install-deps --source-deps-only --prefix=/usr && make && make install

2. Compile a static librdkafka to generate libssl.a and libzstd.a:
   ./configure --enable-static --install-deps --source-deps-only --prefix=/usr && make

3. Compile zlib and cyrus-sasl to generate libz.a and libsasl2.a:

// zlib path
CFLAGS="-O3 -fPIC" ./configure --libdir=/usr/lib --sharedlibdir=/usr/lib
make
make install
// cyrus-sasl
./configure --enable-static --libdir=/usr/lib
4. Set PKG_CONFIG_PATH and LD_LIBRARY_PATH:

export PKG_CONFIG_PATH=$(pwd)/lib/pkgconfig
export LD_LIBRARY_PATH=$(pwd)/lib

5. Copy all the .a files into the lib folder:

cp librdkafka_folder/mklove/deps/dest/libcrypto/usr/lib64/*.a $(pwd)/lib/
cp librdkafka_folder/mklove/deps/dest/libzstd/usr/lib64/*.a $(pwd)/lib/
cp /usr/lib/pkgconfig/rdkafka* $(pwd)/lib/pkgconfig/
cp /usr/lib/librdkafka* $(pwd)/lib/
cp /usr/lib/libz.a $(pwd)/lib/
cp /usr/lib/libsasl2.a $(pwd)/lib/

6. Start the build:

go build -tags static -ldflags "-extldflags '-lz -lssl -lsasl2 -lzstd'" -o test

7. Check the dynamic linking with ldd:

ldd test
// sample output
linux-vdso.so.1 =>  (0x00007ffea7ee0000)
libm.so.6 => /lib64/libm.so.6 (0x00007f772c287000)
libdl.so.2 => /lib64/libdl.so.2 (0x00007f772c083000)
libpthread.so.0 => /lib64/libpthread.so.0 (0x00007f772be67000)
librt.so.1 => /lib64/librt.so.1 (0x00007f772bc5f000)
libc.so.6 => /lib64/libc.so.6 (0x00007f772b892000)
/lib64/ld-linux-x86-64.so.2 (0x00007f772c589000)