How can I remove the "GLIBC_2.27" requirement at compile time?

Asked: 2019-03-07 16:29:51

Tags: gcc glibc dynamic-linking ldd

I've been using a Docker image for C++ compilation. It's based on Ubuntu 18.04. When I try to run the result on some Ubuntu 16 systems, I get the following message:

/lib/x86_64-linux-gnu/libm.so.6: version `GLIBC_2.27' not found

I'll post the full ldd output below. I like using newer compilers, and I'd rather not compile on an older Linux base image (though I will if necessary). I statically link most libraries, but I haven't statically linked glibc; many online resources advise against that. Is there any way to tell my newer compiler (GCC 7.3) not to require the newer glibc? ldd -v output:

    linux-vdso.so.1 (0x00007ffd167cf000)
    libpthread.so.0 => /lib/x86_64-linux-gnu/libpthread.so.0 (0x00007eff77399000)
    librt.so.1 => /lib/x86_64-linux-gnu/librt.so.1 (0x00007eff77191000)
    libm.so.6 => /lib/x86_64-linux-gnu/libm.so.6 (0x00007eff76df3000)
    libgcc_s.so.1 => /lib/x86_64-linux-gnu/libgcc_s.so.1 (0x00007eff76bdb000)
    libc.so.6 => /lib/x86_64-linux-gnu/libc.so.6 (0x00007eff767ea000)
    /lib64/ld-linux-x86-64.so.2 (0x00007eff79f90000)

    Version information:
    lbrycrd-linux (4)/lbrycrdd:
        ld-linux-x86-64.so.2 (GLIBC_2.3) => /lib64/ld-linux-x86-64.so.2
        librt.so.1 (GLIBC_2.2.5) => /lib/x86_64-linux-gnu/librt.so.1
        libm.so.6 (GLIBC_2.27) => /lib/x86_64-linux-gnu/libm.so.6
        libm.so.6 (GLIBC_2.2.5) => /lib/x86_64-linux-gnu/libm.so.6
        libgcc_s.so.1 (GCC_3.3) => /lib/x86_64-linux-gnu/libgcc_s.so.1
        libgcc_s.so.1 (GCC_3.0) => /lib/x86_64-linux-gnu/libgcc_s.so.1
        libgcc_s.so.1 (GCC_4.2.0) => /lib/x86_64-linux-gnu/libgcc_s.so.1
        libpthread.so.0 (GLIBC_2.3.4) => /lib/x86_64-linux-gnu/libpthread.so.0
        libpthread.so.0 (GLIBC_2.3.3) => /lib/x86_64-linux-gnu/libpthread.so.0
        libpthread.so.0 (GLIBC_2.2.5) => /lib/x86_64-linux-gnu/libpthread.so.0
        libpthread.so.0 (GLIBC_2.3.2) => /lib/x86_64-linux-gnu/libpthread.so.0
        libc.so.6 (GLIBC_2.15) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.4) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.14) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.8) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.7) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.9) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.10) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.3.2) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.3) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.2.5) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.3.4) => /lib/x86_64-linux-gnu/libc.so.6
    /lib/x86_64-linux-gnu/libpthread.so.0:
        ld-linux-x86-64.so.2 (GLIBC_2.2.5) => /lib64/ld-linux-x86-64.so.2
        ld-linux-x86-64.so.2 (GLIBC_PRIVATE) => /lib64/ld-linux-x86-64.so.2
        libc.so.6 (GLIBC_2.14) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.3.2) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.4) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.2.5) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_PRIVATE) => /lib/x86_64-linux-gnu/libc.so.6
    /lib/x86_64-linux-gnu/librt.so.1:
        libpthread.so.0 (GLIBC_2.3.2) => /lib/x86_64-linux-gnu/libpthread.so.0
        libpthread.so.0 (GLIBC_PRIVATE) => /lib/x86_64-linux-gnu/libpthread.so.0
        libpthread.so.0 (GLIBC_2.2.5) => /lib/x86_64-linux-gnu/libpthread.so.0
        libc.so.6 (GLIBC_2.14) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.3.2) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.4) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.2.5) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_PRIVATE) => /lib/x86_64-linux-gnu/libc.so.6
    /lib/x86_64-linux-gnu/libm.so.6:
        ld-linux-x86-64.so.2 (GLIBC_PRIVATE) => /lib64/ld-linux-x86-64.so.2
        libc.so.6 (GLIBC_2.4) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.2.5) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_PRIVATE) => /lib/x86_64-linux-gnu/libc.so.6
    /lib/x86_64-linux-gnu/libgcc_s.so.1:
        libc.so.6 (GLIBC_2.14) => /lib/x86_64-linux-gnu/libc.so.6
        libc.so.6 (GLIBC_2.2.5) => /lib/x86_64-linux-gnu/libc.so.6
    /lib/x86_64-linux-gnu/libc.so.6:
        ld-linux-x86-64.so.2 (GLIBC_2.3) => /lib64/ld-linux-x86-64.so.2
        ld-linux-x86-64.so.2 (GLIBC_PRIVATE) => /lib64/ld-linux-x86-64.so.2

3 Answers:

Answer 0 (score: 0)

You need to build against the older glibc version. Few distributions, if any, support that out of the box; at present the only practical approach is to build on an older distribution.

Some distributions with long support cycles ship newer GCC versions that do not require a newer system glibc (for example, the Developer Toolset available for CentOS and Red Hat Enterprise Linux).

Answer 1 (score: 0)

I ran into the same problem.

/lib/i386-linux-gnu/libm.so.6: version `GLIBC_2.27' not found (required by your_lib.so) 

After searching the internet, I found a few links that might help:

Static linking: link with -static. See How can I link to a specific glibc version?

Use Docker: see Can docker solve a problem of mismatched C shared libraries?

A dedicated solution: see https://github.com/wheybags/glibc_version_header

I decided to follow the first one, so I built a static library for your_lib.so and linked it statically into my binary.

Answer 2 (score: 0)

Try symbol versioning, for example:


Related question: Fedora 28 / GLIBC 2.27 libm.so.6 logf() and powf() c++