I'm running into the error in the title, similar to link. I know what the error is trying to tell me, but I'd like to know which row of the data is causing it. Can anyone tell me how to track this down?
!pip install transformers
!pip install datasets

from transformers import BertTokenizer
from datasets import load_dataset

pos = '/content/drive/MyDrive/positive_preprocess.csv'
neg = '/content/drive/MyDrive/negative_preprocess.csv'

train, test = load_dataset("csv", data_files={"train": pos, "test": neg}, split=['train', 'test'])

# Drop the columns the tokenizer does not need
train = train.remove_columns(column_names=['Unnamed: 0', 'hashtag', 'label'])
test = test.remove_columns(column_names=['Unnamed: 0', 'hashtag', 'label'])

# The snippet as pasted never defines `tokenizer`; 'bert-base-uncased' is a
# placeholder for whichever checkpoint is actually loaded
tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')

def tokenize_function(data):
    return tokenizer(data["text"])

tokenized_train = train.map(tokenize_function, batched=True, num_proc=2)
tokenized_test = test.map(tokenize_function, batched=True, num_proc=2)
Traceback:
RemoteTraceback                           Traceback (most recent call last)
RemoteTraceback:
"""
Traceback (most recent call last):
  File "/usr/local/lib/python3.7/dist-packages/multiprocess/pool.py", line 121, in worker
    result = (True, func(*args, **kwds))
  File "/usr/local/lib/python3.7/dist-packages/datasets/arrow_dataset.py", line 174, in wrapper
    out: Union["Dataset", "DatasetDict"] = func(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/datasets/fingerprint.py", line 340, in wrapper
    out = func(self, *args, **kwargs)
  File "/usr/local/lib/python3.7/dist-packages/datasets/arrow_dataset.py", line 1823, in _map_single
    offset=offset,
  File "/usr/local/lib/python3.7/dist-packages/datasets/arrow_dataset.py", line 1715, in apply_function_on_filtered_inputs
    function(*fn_args, effective_indices, **fn_kwargs) if with_indices else function(*fn_args, **fn_kwargs)
  File "<ipython-input-16-aa03da28a7e7>", line 2, in tokenize_function
    return tokenizer(data["text"])
  File "/usr/local/lib/python3.7/dist-packages/transformers/tokenization_utils_base.py", line 2271, in __call__
    **kwargs,
  File "/usr/local/lib/python3.7/dist-packages/transformers/tokenization_utils_base.py", line 2456, in batch_encode_plus
    **kwargs,
  File "/usr/local/lib/python3.7/dist-packages/transformers/tokenization_utils.py", line 545, in _batch_encode_plus
    first_ids = get_input_ids(ids)
  File "/usr/local/lib/python3.7/dist-packages/transformers/tokenization_utils.py", line 526, in get_input_ids
    "Input is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers."
ValueError: Input is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers.
"""

The above exception was the direct cause of the following exception:

ValueError                                Traceback (most recent call last)
<ipython-input-42-61ad0d3cfb1a> in <module>()
----> 1 tokenized_train = train.map(tokenize_function, batched=True, num_proc=2)
      2 tokenized_test = test.map(tokenize_function, batched=True, num_proc=2)
/usr/local/lib/python3.7/dist-packages/transformers/tokenization_utils.py in get_input_ids()
    524         else:
    525             raise ValueError(
--> 526                 "Input is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers."
    527             )
    528

ValueError: Input is not valid. Should be a string, a list/tuple of strings or a list/tuple of integers.
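
Edit: my current guess is that some rows in the CSVs have an empty text cell, which load_dataset reads back as None, and the tokenizer rejects None. Below is a minimal sketch I put together for finding (and optionally dropping) such rows; it assumes the relevant column is named "text", as in the code above.

# Report the indices of rows whose "text" is not a string
# (empty CSV cells typically come back as None)
bad_train = [i for i, ex in enumerate(train) if not isinstance(ex["text"], str)]
bad_test = [i for i, ex in enumerate(test) if not isinstance(ex["text"], str)]
print("bad train rows:", bad_train)
print("bad test rows:", bad_test)

# Optionally drop those rows before tokenizing
train = train.filter(lambda ex: isinstance(ex["text"], str))
test = test.filter(lambda ex: isinstance(ex["text"], str))

Rerunning the map calls with num_proc=1 also helps while debugging, since it avoids the wrapped RemoteTraceback and gives a single-process traceback. Is this the right way to track down the offending row, or is there a built-in way to get its index?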