How to iterate over 2 columns and match them one by one

Date: 2020-06-02 10:08:28

Tags: python pandas loops dataframe sequencematcher

Suppose I have 2 Excel files, each containing a column of names and dates.

Excel 1:

   Name
0  Bla bla bla June 04 2018
1  Puppy Dog June 01 2017
2  Donald Duck February 24 2017
3  Bruno Venus April 24 2019
3      Bruno Venus April 24 2019

Excel 2:

   Name
0  Pluto Feb 09 2019
1  Donald Glover Feb 22 2020
2  Dog Feb 22 2020
3  Bla Bla Feb 22 2020

I want to compare every cell in column 1 against every cell in column 2 and find the best match.

The following function returns a score between 0 and 1 indicating how closely the two inputs match each other.

SequenceMatcher code example:

from difflib import SequenceMatcher

def similar(a, b):
    return SequenceMatcher(None, a, b).ratio()


x = "Adam Clausen a Feb 09 2019"
y = "Adam Clausen Feb 08 2019"
print(similar(x,y))

Output: 0.92
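For context, SequenceMatcher.ratio() is defined as 2*M/T, where M is the total size of all matching blocks and T is the combined length of the two strings. A minimal sketch (using the same x and y as above) that reproduces the 0.92 from that formula:

from difflib import SequenceMatcher

x = "Adam Clausen a Feb 09 2019"   # 26 characters
y = "Adam Clausen Feb 08 2019"     # 24 characters

sm = SequenceMatcher(None, x, y)
# Sum the sizes of all matching blocks (the final block is a size-0 sentinel)
M = sum(block.size for block in sm.get_matching_blocks())   # 23
T = len(x) + len(y)                                         # 50
print(2.0 * M / T)   # 0.92, identical to sm.ratio()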

2 Answers:

Answer 0 (score: 1)

If you know how to load the columns as dataframes, this code should do the job:

from difflib import SequenceMatcher

col_1 = ['potato', 'tomato', 'apple']
col_2 = ['tomatoe', 'potatao', 'appel']

def similar(a, b):
    # Return the ratio together with the pair of strings that produced it
    ratio = SequenceMatcher(None, a, b).ratio()
    matches = a, b
    return ratio, matches

for i in col_1:
    # Tuples compare element-wise, so max() ranks candidates by ratio first
    print(max(similar(i, j) for j in col_2))
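If you only need the best matching string rather than the (ratio, pair) tuple, a key function on max() does the same ranking. A small variant, where best_match is a made-up helper name:

from difflib import SequenceMatcher

col_1 = ['potato', 'tomato', 'apple']
col_2 = ['tomatoe', 'potatao', 'appel']

def best_match(name, candidates):
    # Rank candidates purely by their ratio against name
    return max(candidates, key=lambda c: SequenceMatcher(None, name, c).ratio())

for i in col_1:
    print(i, '->', best_match(i, col_2))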

Answer 1 (score: 0)

Updated/solved parts

The following code:

  • converts the 2 input files into dataframes
  • takes one specific column from each (in this case both are called Name) and uses it as the matching input
  • takes one name from file 1 and iterates over all names in file 2
  • then takes the name with the highest match and saves the two corresponding rows next to each other in the output file

Code:

import pandas as pd
import numpy as np
from difflib import SequenceMatcher

def similar(a, b):
    return SequenceMatcher(None, a, b).ratio()

# Load both input files into dataframes (read_excel already returns one)
df1 = pd.read_excel(r'File1.xlsx')
df2 = pd.read_excel(r'File2.xlsx')

df1['Name'] = df1['Name'].astype(str)
df2['Name'] = df2['Name'].astype(str)

# For every name in file 1, find the row index of the best match in file 2
order = []
for index, row in df1.iterrows():
    maxima = [similar(row['Name'], j) for j in df2['Name']]
    best_row = np.argmax(maxima)  # position of the highest ratio
    order.append(best_row)

# Rearrange df2 so each row lines up with its best match in df1,
# then save the two frames side by side in the output file
df2 = df2.iloc[order].reset_index(drop=True)
dfFinal = pd.concat([df1, df2], axis=1)
dfFinal.to_excel("OUTPUT.xlsx", index=False)
# Thank you for the help!
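One caveat with this approach: np.argmax() picks the best match for each row of file 1 independently, so several rows can map to the same row of file 2. A hypothetical extension of the script above that also keeps each row's best ratio (the Match_Ratio column name is made up), so weak matches are easy to spot in the output:

import pandas as pd
import numpy as np
from difflib import SequenceMatcher

def similar(a, b):
    return SequenceMatcher(None, a, b).ratio()

df1 = pd.read_excel(r'File1.xlsx')
df2 = pd.read_excel(r'File2.xlsx')
df1['Name'] = df1['Name'].astype(str)
df2['Name'] = df2['Name'].astype(str)

order, ratios = [], []
for index, row in df1.iterrows():
    maxima = [similar(row['Name'], j) for j in df2['Name']]
    order.append(int(np.argmax(maxima)))
    ratios.append(max(maxima))  # best ratio for this row

dfFinal = pd.concat([df1, df2.iloc[order].reset_index(drop=True)], axis=1)
dfFinal['Match_Ratio'] = ratios  # hypothetical column name
dfFinal.to_excel("OUTPUT.xlsx", index=False)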