OK, so I'm building a Base64 encoder/decoder that converts hex to base64 and back, but I've hit a strange problem that I'm trying to understand. Here is the code:
string b64_encode(string str)
{
    string newStr = "";
    string ref = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    unsigned long long h = 0;
    for(int i = 0; i < str.size(); i += 3)
    {
        //Get every 3 chars
        char a = str[i];
        char b = str[i+1];
        char c = str[i+2];
        //Now, convert each hex character (base 16) to its equivalent decimal value
        //and merge them into one variable
        h  = strtoull(&a, nullptr, 16) << 8; //shift left by 8 bits
        h |= strtoull(&b, nullptr, 16) << 4; //shift left by 4 bits
        h |= strtoull(&c, nullptr, 16);      //no shift required; only the first two characters need one
        cout << h << endl;                   //for testing purposes only
    }
    return newStr;
}
When I run this code on Mac OS X, I get the following results, which are wrong:
4052
3959
1570
4091
3814
...
However, the same code built with Visual Studio 2013 on Windows 8 gives me the correct values:
1170
1901
518
2921
1734
...
The hex string I'm using:
string str = "49276d206b696c6c696e6720796f757220627261696e206c696b65206120706f69736f6e6f7573206d757368726f6f6d";
So my question is: is there a way to get the correct numbers on Mac OS X? I looked around online, but nothing helped.
Answer 0 (score: 0)
As @mch pointed out, the problem is that the single characters are not null-terminated, so passing their addresses to strtoull invokes undefined behavior. Thanks to him, here is the corrected version:
string b64_encode(string str)
{
    string newStr = "";
    string ref = "ABCDEFGHIJKLMNOPQRSTUVWXYZabcdefghijklmnopqrstuvwxyz0123456789+/";
    unsigned long long h = 0;
    for(int i = 0; i < str.size(); i += 3)
    {
        //Get every 3 chars, each as a null-terminated two-char array
        char a[2] = {str[i], 0};
        char b[2] = {str[i+1], 0};
        char c[2] = {str[i+2], 0};
        //Now, convert each hex character (base 16) to its equivalent decimal value
        //and merge them into one variable
        h  = strtoull(a, nullptr, 16) << 8; //shift left by 8 bits
        h |= strtoull(b, nullptr, 16) << 4; //shift left by 4 bits
        h |= strtoull(c, nullptr, 16);      //no shift required; only the first two characters need one
        cout << h << endl;
    }
    return newStr;
}
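As a side note, the per-character strtoull calls can be avoided entirely by converting each hex digit with plain arithmetic. Here is a minimal, self-contained sketch of that idea; the hexVal helper and the standalone main are my own additions for illustration, not part of the original encoder:

#include <cctype>
#include <iostream>
#include <string>
using namespace std;

// Value of a single hex digit '0'-'9', 'a'-'f', 'A'-'F' as 0-15.
int hexVal(char ch)
{
    if (isdigit(static_cast<unsigned char>(ch)))
        return ch - '0';
    return tolower(static_cast<unsigned char>(ch)) - 'a' + 10;
}

int main()
{
    string str = "49276d206b696c6c696e6720796f757220627261696e206c696b65206120706f69736f6e6f7573206d757368726f6f6d";
    for (size_t i = 0; i + 2 < str.size(); i += 3)
    {
        // Merge three hex digits (4 bits each) into one 12-bit value.
        unsigned long long h = (hexVal(str[i]) << 8)
                             | (hexVal(str[i + 1]) << 4)
                             |  hexVal(str[i + 2]);
        cout << h << endl;
    }
    return 0;
}

Because no temporary C strings are built, there is nothing to null-terminate and no undefined behavior to worry about; on the string above this prints the same 1170, 1901, 518, ... sequence as the corrected code.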