Need help with Perl's IO::Handle::sync on a 64-bit Strawberry Perl installation

Asked: 2015-06-19 13:41:19

Tags: perl server

I am a .NET developer with no Perl experience. I have to set up a Perl script as a scheduled task on my company's 64-bit Windows Server 2012 machine. The script was written by another department, and my department now has to take over running it. Strawberry Perl (64-bit) 5.20.2.1-64bit is installed on the server. I have managed to figure out how to install Perl, change the settings in the program, and so on, so that everything points at the new server.

When I try to run the script, it fails while reading data from a CSV file and loading it into the database, with this error: "IO::Handle::sync not implemented on this architecture".
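
If I understand correctly, IO::Handle::sync is just a thin wrapper around the C fsync() call, so I assume the error means this Perl build was compiled without fsync support. A quick check like this (my own guess at a diagnostic, not part of the script) should show whether that is the case:

# My own check: see whether this perl was built with fsync(),
# which IO::Handle::sync relies on. On Windows builds this is
# typically 'undef'.
use Config;
print "d_fsync = ", $Config{d_fsync} // 'undef', "\n";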

IO::Handle is installed on the server, but I can't install IO::Handle::sync. I assume this has something to do with the server being 64-bit?

I don't know Perl well enough to feel comfortable changing the script to use a different module, and I don't have enough time to learn the language before this has to be set up. Is there any way to get IO::Handle::sync working on this system? Could I install 32-bit Strawberry Perl on the 64-bit server, and if so, would that solve the problem?
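
One idea I had (I'm not sure it is safe, hence the question) is to simply tolerate the failure, since close() already flushes Perl's output buffers and only the explicit flush-to-disk guarantee would be lost. Something along these lines around the failing call:

# Sketch of a possible workaround (my assumption, not the original code):
# attempt the fsync, but don't die on platforms where it isn't implemented.
eval { $out->sync };
warn "sync skipped: $@" if $@;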

Here is the function I'm having trouble with:

# Convert CSV data file to bulk insert format.
sub convert_data_file ($$$$$$$) {
    my ($omniture_dbh, $omniture_mappings, $feed, $file, $basename, $table, $rsid) = @_;

    # Open CSV data file.
    print "Processing data file: $file";
    open my $in, "<:encoding(utf8)", $file or die "$file: $!\n";

    # Remember start time.
    my $start = time;

    # Discard byte-order mark if present.
    seek $in, 0, 0 if sysread $in, $_, 1 and $_ ne "\x{FEFF}";

    # Initialize CSV parser.
    my $csv = Text::CSV_XS->new({ binary => 1, auto_diag => 2, allow_loose_quotes => 1, allow_loose_escapes => 1, allow_whitespace => 1 });

    # Read header row from CSV data file.
    my $header = $csv->getline($in) or die "Header line missing!\n";

    # CSV row.
    my $row = {};

    # Bind CSV data to hash elements by header field name.
    $csv->bind_columns(\@{$row}{@{$header}});

    # Map CSV fields to database columns.
    my ($column_source, $hooks) = map_csv_fields $omniture_dbh, $omniture_mappings, $feed, $header, $row;

    # Map database columns for bulk insert.
    my $fields = map_database_columns $omniture_dbh, $table, $row, $basename, $rsid, $feed->{feed_version}, $column_source;

    # Use ".data" extension for bulk insert data file.
    my $bulk_insert_file = "$basename.data";

    # Open bulk insert data file.
    open my $out, ">:encoding(UCS-2LE)", $bulk_insert_file or die "$bulk_insert_file: $!\n";

    # Write Byte-Order Mark (BOM).
    print $out "\x{FEFF}";

    # Record counter.
    my $records = 0;

    # Read data rows from CSV data file.
    while ($csv->getline($in)) {
        # Call hooks as necessary.
        $_->() foreach @{$hooks};

        # Create bulk insert data record from mapped data values.
        $_ = join "|~|", map { ${$_}; } @{$fields};

        # Unescape hex escapes.
        s/\\x([A-Fa-f0-9]{2})/pack "C", hex $1/eg;

        # Strip ASCII control codes (except newline/tab) and invalid UCS-2 characters.
        {
            no warnings;
            tr/\n\t\x{0020}-\x{d7ff}\x{e000}-\x{ffff}//cd;
        }

        # Unescape backslashes, newlines and tabs.
        s/\\(\\|\n|\t)/$1/g;

        # Write data record and record terminator to bulk insert data file.
        print $out "$_|~~|\n";

        # Increment record counter.
        $records++;
    }

    # Close CSV data file.
    close $in or die "$file: $!";

    # Flush output buffers.
    $out->flush;

    # Sync file to disk.
    $out->sync;

    # Close bulk insert data file.
    close $out or die "$bulk_insert_file: $!";

    # Print informational message.
 printf " (%d records converted (%.2f seconds)", $records, time - $start;


    # Return variables of interest.
    return $column_source, $bulk_insert_file, $records;
}
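
If the flush-to-disk behaviour really is required, would replacing the sync call with the native Windows call through Win32API::File be reasonable? This is an untested sketch on my part, and the exact calls are my assumption (Win32API::File ships with Perl on Windows, as far as I know):

# Sketch only: flush the OS file buffers on Windows via the native API,
# in place of "$out->sync;".
use Win32API::File qw(GetOsFHandle FlushFileBuffers);

if (my $os_handle = GetOsFHandle($out)) {
    FlushFileBuffers($os_handle)
        or warn "FlushFileBuffers failed: $^E\n";
}
else {
    warn "GetOsFHandle failed: $^E\n";
}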

0 answers:

No answers yet.