Typing Unicode symbols with WM_CHAR

Date: 2015-03-03 15:04:41

Tags: c++ winapi unicode

How can I type Unicode symbols (such as Cyrillic letters) using the WM_CHAR message? Right now the Cyrillic characters come out wrong. Here is my code:

DWORD dwCurrentThreadId = GetCurrentThreadId();
HWND hForeground = GetForegroundWindow();
DWORD dwForegroundThreadId = GetWindowThreadProcessId(hForeground, NULL);
AttachThreadInput(dwForegroundThreadId, dwCurrentThreadId, TRUE);
// 'character' is the UTF-16 code unit I want to type
PostMessageW(GetFocus(), WM_CHAR, character, 1);

1 Answer:

Answer 0 (score: 1)

You can't simulate keyboard input with PostMessage(). Use SendInput() instead:

INPUT input = {0};

input.type = INPUT_KEYBOARD;
input.ki.wScan = (WORD) character;     // UTF-16 code unit to type
input.ki.dwFlags = KEYEVENTF_UNICODE;  // interpret wScan as a Unicode code unit

SendInput(1, &input, sizeof(INPUT));
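
Note that a real keystroke normally arrives as a down/up pair. The SendInput() documentation allows KEYEVENTF_UNICODE to be combined with KEYEVENTF_KEYUP, so a slightly fuller sketch (my own addition, assuming the target application cares about the key-up event) looks like this:

// Hedged sketch: send both the key-down and the matching key-up
// for one UTF-16 code unit ('character' as above).
INPUT pair[2] = {0};

pair[0].type = INPUT_KEYBOARD;
pair[0].ki.wScan = (WORD) character;
pair[0].ki.dwFlags = KEYEVENTF_UNICODE;                    // key down

pair[1] = pair[0];
pair[1].ki.dwFlags = KEYEVENTF_UNICODE | KEYEVENTF_KEYUP;  // key up

SendInput(2, pair, sizeof(INPUT));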

Unicode on Windows uses UTF-16. wScan is 16 bits, so it can hold only a single UTF-16 code unit. Code points up to U+FFFF fit in one code unit, but to send a code point above U+FFFF (which requires 2 code units, a surrogate pair) you have to supply 2 INPUT values, one per code unit:

INPUT input[2] = {0};
UINT numInput;

// character should be a 32-bit code point and must not exceed 0x10FFFF...
if (character <= 0xFFFF)
{
    input[0].type = INPUT_KEYBOARD;
    input[0].ki.wScan = (WORD) character;
    input[0].ki.dwFlags = KEYEVENTF_UNICODE;

    numInput = 1;
}
else
{
    character -= 0x010000; // shift down into the 20-bit supplementary range

    input[0].type = INPUT_KEYBOARD;
    input[0].ki.wScan = (WORD) (((character >> 10) & 0x03FF) + 0xD800); // high surrogate
    input[0].ki.dwFlags = KEYEVENTF_UNICODE;

    input[1].type = INPUT_KEYBOARD;
    input[1].ki.wScan = (WORD) ((character & 0x03FF) + 0xDC00); // low surrogate
    input[1].ki.dwFlags = KEYEVENTF_UNICODE;

    numInput = 2;
}

SendInput(numInput, input, sizeof(INPUT));
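
To sanity-check the surrogate arithmetic with a concrete value (my own example, not from the original answer): for U+1F600, character - 0x10000 = 0x0F600; the high surrogate is 0xD800 + (0x0F600 >> 10) = 0xD83D and the low surrogate is 0xDC00 + (0x0F600 & 0x3FF) = 0xDE00, which matches the standard UTF-16 encoding D83D DE00.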

You can wrap this in a function that sends a UTF-16 encoded input string:

#include <windows.h>
#include <string>
#include <vector>

void SendInputStr(const std::wstring &str) // in C++11, use std::u16string instead...
{
    if (str.empty()) return;

    // one INPUT per UTF-16 code unit, zero-initialized by the vector
    std::vector<INPUT> input(str.length());

    for (std::size_t i = 0; i < str.length(); ++i)
    {
        input[i].type = INPUT_KEYBOARD;
        input[i].ki.wScan = (WORD) str[i];       // one UTF-16 code unit
        input[i].ki.dwFlags = KEYEVENTF_UNICODE;
    }

    SendInput((UINT) input.size(), &input[0], sizeof(INPUT));
}
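
A minimal usage sketch (my own example; on Windows wchar_t is 16 bits, so a wide string literal is already UTF-16, and surrogate pairs inside it just work):

// Types the Cyrillic word into whichever window currently has keyboard focus.
SendInputStr(L"Привет");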