I can't get my char array to convert to hex correctly. Here is some example code:
#include <cstdio>
#include <iostream>
#include <iomanip>

char data[240] = ...

for (int i = 0; i < sizeof(data); ++i) {
    printf("%02X", (unsigned char)data[i]);
}
printf("\n\n");

for (int i = 0; i < sizeof(data); ++i) {
    std::cout << std::hex << std::setfill('0') << std::setw(2)
              << std::uppercase << (unsigned short)data[i];
}
Here is the output:

14000000E9DE05629F963FE5EC98760401001001EFA8C7561800000001000000020000004494C6924001A8C02C6EFA003A000000B80000003800000004000000EC9876040100100116DC0300BE6D5A924001A8C000000000E33FC65663EFE1560100E4780000010000FA050000000000B6294CC5BC4AAF09DB1F96EAD3EC3E333EF46A46D76E700CDACE12916F2764EB107F3D47FEF46A757BDAA2EEB87523B5CEA526810102E0AF74B55CB172ACF160FC9314B80AFD1D3F8DF143CAA17575A1FE788BBD599B747B33F2BAEC09B18DF4AF7E91D59270FB815C865D25FFDCE794C7F87671207036E060FFAC7E65F61999

14000000FFE9FFDE0562FF9FFF963FFFE5FFECFF98760401001001FFEFFFA8FFC75618000000010000000200000044FF94FFC6FF924001FFA8FFC02C6EFFFA003A000000FFB80000003800000004000000FFECFF9876040100100116FFDC0300FFBE6D5AFF924001FFA8FFC000000000FFE33FFFC65663FFEFFFE1560100FFE4780000010000FFFA050000000000FFB6294CFFC5FFBC4AFFAF09FFDB1FFF96FFEAFFD3FFEC3E333EFFF46A46FFD76E700CFFDAFFCE12FF916F2764FFEB107F3D47FFFEFFF46A757BFFDAFFA2FFEEFFB87523FFB5FFCEFFA526FF810102FFE0FFAF74FFB55CFFB172FFACFFF160FFFCFF9314FFB80AFFFD1D3FFF8DFFF143FFCAFFA17575FFA1FFFE78FF8BFFBD59FF9B747B33FFF2FFBAFFEC09FFB1FF8DFFF4FFAF7EFF91FFD5FF9270FFFBFF815CFF865D25FFFFFFDCFFE7FF94FFC7FFF87671207036FFE060FFFFFFAC7E65FFF619FF9999

As you can see, the std::hex result does not match the correct printf version: std::hex is inserting FF where it shouldn't. What am I doing wrong?