I have a list, mostly related to obituaries.
Leonard Wilson 1867 - 1936
Mark Jonson 1892 - 1961
Alex Jean Kinshaw 1951 - 1993
Elizabeth Mae Martin 1934 - 1998
The data needs to be analysed for research, and it needs to be arranged in "csv" format as a timeline (separated by ",", with empty values marked by "-") running from 1850 to 2015.
Leonard Wilson,-,-,-,-,-,-,-,-,-,-,-,-,-,-,1867,1868,1869......1934,1935,1936,-,-,-,-,-,-,-,-,-,-,-,-
Mark Jonson,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,1892,1893,1894,1895,1896,1897......,1958,1959,1960,1961,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-
....
# All years in the middle need to be populated please
As you can see in the data above, the years before a person's birth are marked with '-', and so are the years after their death (up to 2015). All the years in between need to be populated.
The python/pandas code needs to detect the start and end years and populate the years in between.
Is there any way to achieve this? I have over 30k rows of data.
Answer 0 (score: 2)
Yes, you can do it like this:
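A minimal sketch of one way to do this with pandas, assuming the four sample rows and the 1850-2015 range from the question (variable names and the output filename are illustrative):

```python
import pandas as pd

# Sample rows in the question's "Name start - end" format
lines = [
    "Leonard Wilson 1867 - 1936",
    "Mark Jonson 1892 - 1961",
    "Alex Jean Kinshaw 1951 - 1993",
    "Elizabeth Mae Martin 1934 - 1998",
]

records = []
for line in lines:
    # split from the right: the name may contain spaces, the years do not
    name, start, _, end = line.rsplit(" ", 3)
    records.append((name, int(start), int(end)))

# one column per year from 1850 to 2015; '-' outside the lifespan
years = list(range(1850, 2016))
rows = {
    name: [str(y) if start <= y <= end else "-" for y in years]
    for name, start, end in records
}
df = pd.DataFrame.from_dict(rows, orient="index", columns=years)
df.to_csv("timeline.csv", header=False)
```

Each row is built in a single comprehension over the 166 years, so even 30k rows should finish quickly.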
Answer 1 (score: 2)
Another way, just processing it line by line:
import pandas as pd
df_str = '''
Leonard Wilson 1867 - 1936
Mark Jonson 1892 - 1961
Alex Jean Kinshaw 1951 - 1993
Elizabeth Mae Martin 1934 - 1998
'''
# one raw string per line
obj = pd.Series(df_str.strip().split('\n'))
# split from the right so multi-word names stay in one piece
df = obj.str.rsplit(' ', n=3, expand=True)
df.columns = ['name', 'start_yr', '-', 'end_yr']
print(df)
#                    name start_yr  -  end_yr
# 0        Leonard Wilson     1867  -   1936
# 1           Mark Jonson     1892  -   1961
# 2     Alex Jean Kinshaw     1951  -   1993
# 3  Elizabeth Mae Martin     1934  -   1998
# convert the year columns to int
df[['start_yr', 'end_yr']] = df[['start_yr', 'end_yr']].astype(int)
# iterrows: expand each row into one record per year lived
dfn_list = []
for _, row in df.iterrows():
    dfn = pd.DataFrame(range(row['start_yr'], row['end_yr'] + 1), columns=['yr'])
    dfn['name'] = row['name']
    dfn['tag'] = dfn['yr'].astype(str)
    dfn_list.append(dfn)
# merge
dfm = pd.concat(dfn_list)
print(dfm.head())
#      yr            name   tag
# 0  1867  Leonard Wilson  1867
# 1  1868  Leonard Wilson  1868
# 2  1869  Leonard Wilson  1869
# 3  1870  Leonard Wilson  1870
# 4  1871  Leonard Wilson  1871
# pivot back: one row per name, '-' for years outside the lifespan
# (note: the columns only cover the years observed in the data)
dfm = dfm.set_index(['name', 'yr'])['tag'].unstack(fill_value='-')
dfm.to_csv('test.csv', header=False)
!cat test.csv
Result:
Alex Jean Kinshaw,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,1951,1952,1953,1954,1955,1956,1957,1958,1959,1960,1961,1962,1963,1964,1965,1966,1967,1968,1969,1970,1971,1972,1973,1974,1975,1976,1977,1978,1979,1980,1981,1982,1983,1984,1985,1986,1987,1988,1989,1990,1991,1992,1993,-,-,-,-,-
Elizabeth Mae Martin,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,1934,1935,1936,1937,1938,1939,1940,1941,1942,1943,1944,1945,1946,1947,1948,1949,1950,1951,1952,1953,1954,1955,1956,1957,1958,1959,1960,1961,1962,1963,1964,1965,1966,1967,1968,1969,1970,1971,1972,1973,1974,1975,1976,1977,1978,1979,1980,1981,1982,1983,1984,1985,1986,1987,1988,1989,1990,1991,1992,1993,1994,1995,1996,1997,1998
Leonard Wilson,1867,1868,1869,1870,1871,1872,1873,1874,1875,1876,1877,1878,1879,1880,1881,1882,1883,1884,1885,1886,1887,1888,1889,1890,1891,1892,1893,1894,1895,1896,1897,1898,1899,1900,1901,1902,1903,1904,1905,1906,1907,1908,1909,1910,1911,1912,1913,1914,1915,1916,1917,1918,1919,1920,1921,1922,1923,1924,1925,1926,1927,1928,1929,1930,1931,1932,1933,1934,1935,1936,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-
Mark Jonson,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,1892,1893,1894,1895,1896,1897,1898,1899,1900,1901,1902,1903,1904,1905,1906,1907,1908,1909,1910,1911,1912,1913,1914,1915,1916,1917,1918,1919,1920,1921,1922,1923,1924,1925,1926,1927,1928,1929,1930,1931,1932,1933,1934,1935,1936,1937,1938,1939,1940,1941,1942,1943,1944,1945,1946,1947,1948,1949,1950,1951,1952,1953,1954,1955,1956,1957,1958,1959,1960,1961,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-,-
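One caveat with the pivoted output above: its columns only cover the years that actually occur in the data (1867 to 1998 here), not the full 1850 to 2015 timeline the question asks for. A `reindex` over the columns can pad the missing years. A small self-contained sketch, where the single-row frame just stands in for the real pivoted result:

```python
import pandas as pd

# stand-in for the pivoted frame: one person, observed years 1867-1936
years = range(1867, 1937)
dfm = pd.DataFrame(
    [[str(y) for y in years]],
    index=pd.Index(['Leonard Wilson'], name='name'),
    columns=pd.Index(years, name='yr'),
)

# pad the columns out to the full requested 1850-2015 range
dfm_full = dfm.reindex(columns=range(1850, 2016), fill_value='-')
dfm_full.to_csv('test_full.csv', header=False)
```

`reindex(..., fill_value='-')` inserts '-' for every year column that was missing, so each row ends up with the same 166 columns.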