Importing a CSV with a list as a column into a pandas DataFrame

Time: 2019-12-09 19:44:24

Tags: python-3.x pandas list csv import

Hi, I have a "pseudo" CSV file that looks like this:

id, Wave ID, Time Stamp, Number of Samples, Sample Data Array
123, 317, 1567191561.8044672, 128, 79, 17, 162, 165, 66, 3, 40, 191, 68, 56, 59, 142, 143, 7, 150, 14, 120, 172, 76, 167, 55, 27, 198, 115, 50, 87, 38, 185, 199, 74, 43, 4, 133, 114, 89, 10, 136, 46, 85, 187, 182, 170, 149, 9, 25, 128, 39, 175, 102, 45, 33, 35, 129, 156, 20, 118, 108, 72, 111, 99, 122, 140, 93, 155, 54, 63, 189, 173, 171, 134, 163, 159, 91, 193, 64, 8, 97, 34, 80, 11, 121, 145, 190, 135, 144, 31, 29, 179, 125, 116, 196, 67, 152, 112, 148, 103, 132, 106, 78, 75, 28, 174, 119, 98, 110, 86, 123, 141, 84, 83, 178, 12, 169, 113, 48, 131, 52, 180, 100, 117, 6, 77, 69, 146, 18, 157, 127, 164
123, 20,  1567191562.0020044, 16, 779, 788, 801, 817, 835, 855, 875, 895, 916, 933, 946, 956, 963, 965, 962, 952
123, 20,  1567191561.8064446, 0,
123, 317, 1567191561.8044672, 100, 132, 48, 195, 78, 190, 124, 38, 99, 87, 1, 66, 6, 106, 18, 180, 197, 59, 148, 41, 128, 125, 194, 175, 81, 21, 115, 184, 30, 71, 77, 166, 3, 107, 114, 52, 55, 186, 5, 103, 145, 19, 8, 69, 64, 122, 90, 129, 83, 165, 79, 178, 2, 14, 74, 25, 133, 147, 158, 75, 146, 20, 140, 101, 97, 10, 143, 88, 50, 168, 112, 118, 9, 137, 155, 24, 89, 144, 16, 13, 156, 196, 113, 183, 34, 120, 142, 130, 49, 86, 46, 138, 191, 192, 189, 70, 123, 159, 108, 7, 95

So the first four columns are a normal CSV, and the rest of each row is a list of varying length. The "Number of Samples" column gives the length of that list, and each row ends with a newline.

The final DataFrame should look like this:

id, Wave ID, Time Stamp, Sample Data Array
123, 317, 1567191561.8044672, [1,2,3,4,5,...]
123, 317, 1567191561.8044672, [1,2,3,4,5,...]
123, 20, 1567191561.8044672, []
123, 317, 1567191563.8044672, [1,2,3,4]

Is there some way to import this file with read_csv in pandas, or with something else? I wrote a simple parser that reads the file line by line, but it is slow. I'd like to end up with a pandas DataFrame so I can group/sort by the columns.
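
For illustration, the kind of line-by-line parser I mean looks roughly like this (a simplified sketch, not the exact code):

import csv
import pandas as pd

def parse_file(path):
    rows = []
    with open(path, newline='') as f:
        reader = csv.reader(f, skipinitialspace=True)
        next(reader)  # skip the header row
        for rec in reader:
            if not rec:
                continue
            # first four fields are fixed; everything after them is the sample list
            rid, wave_id, ts, _n_samples = rec[:4]
            samples = [int(x) for x in rec[4:] if x.strip()]
            rows.append((int(rid), int(wave_id), float(ts), samples))
    return pd.DataFrame(rows, columns=['id', 'Wave ID', 'Time Stamp', 'Sample Data Array'])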

Thanks

1 Answer:

Answer 0 (score: 2)

You can read the data as a single column (using a separator guaranteed not to occur in the data) and then split it into 5 columns. After that you can drop the "Number of Samples" column and convert the last column into a list:

import pandas as pd
import io
import datetime

s="""id, Wave ID, Time Stamp, Number of Samples, Sample Data Array
123, 317, 1567191561.8044672, 128, 79, 17, 162, 165, 66, 3, 40, 191, 68, 56, 59, 142, 143, 7, 150, 14, 120, 172, 76, 167, 55, 27, 198, 115, 50, 87, 38, 185, 199, 74, 43, 4, 133, 114, 89, 10, 136, 46, 85, 187, 182, 170, 149, 9, 25, 128, 39, 175, 102, 45, 33, 35, 129, 156, 20, 118, 108, 72, 111, 99, 122, 140, 93, 155, 54, 63, 189, 173, 171, 134, 163, 159, 91, 193, 64, 8, 97, 34, 80, 11, 121, 145, 190, 135, 144, 31, 29, 179, 125, 116, 196, 67, 152, 112, 148, 103, 132, 106, 78, 75, 28, 174, 119, 98, 110, 86, 123, 141, 84, 83, 178, 12, 169, 113, 48, 131, 52, 180, 100, 117, 6, 77, 69, 146, 18, 157, 127, 164
123, 20,  1567191562.0020044, 16, 779, 788, 801, 817, 835, 855, 875, 895, 916, 933, 946, 956, 963, 965, 962, 952
123, 20,  1567191561.8064446, 0,
123, 317, 1567191561.8044672, 100, 132, 48, 195, 78, 190, 124, 38, 99, 87, 1, 66, 6, 106, 18, 180, 197, 59, 148, 41, 128, 125, 194, 175, 81, 21, 115, 184, 30, 71, 77, 166, 3, 107, 114, 52, 55, 186, 5, 103, 145, 19, 8, 69, 64, 122, 90, 129, 83, 165, 79, 178, 2, 14, 74, 25, 133, 147, 158, 75, 146, 20, 140, 101, 97, 10, 143, 88, 50, 168, 112, 118, 9, 137, 155, 24, 89, 144, 16, 13, 156, 196, 113, 183, 34, 120, 142, 130, 49, 86, 46, 138, 191, 192, 189, 70, 123, 159, 108, 7, 95"""

tmp = pd.read_csv(io.StringIO(s), sep='§', engine='python')
df = tmp.iloc[:, 0].str.split(', *', n=4, expand=True)  # at most 4 splits -> 5 columns; the remainder stays joined
df.columns = [c.strip() for c in tmp.columns[0].split(',')]
df = df.drop(columns='Number of Samples')

df.id = df.id.astype(int)
df['Wave ID'] = df['Wave ID'].astype(int)
df['Time Stamp'] = df['Time Stamp'].astype(float).map(datetime.datetime.fromtimestamp)
df['Sample Data Array'] = df['Sample Data Array'].str.split(', *')

Result:

    id  Wave ID                 Time Stamp                                  Sample Data Array
0  123      317 2019-08-30 20:59:21.804467  [79, 17, 162, 165, 66, 3, 40, 191, 68, 56, 59,...
1  123       20 2019-08-30 20:59:22.002004  [779, 788, 801, 817, 835, 855, 875, 895, 916, ...
2  123       20 2019-08-30 20:59:21.806444                                                 []
3  123      317 2019-08-30 20:59:21.804467  [132, 48, 195, 78, 190, 124, 38, 99, 87, 1, 66...
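
Note that the split leaves the samples as strings. If you need numeric values (assuming all samples are integers), you could convert them afterwards, for example:

df['Sample Data Array'] = df['Sample Data Array'].apply(
    lambda xs: [int(x) for x in xs if x.strip()]  # drop the empty string left by rows with no samples
)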