This is my lexer.hpp file:
/**
 * \file
 * \brief Lexical analysis
 * \author Igor Zashchelkin
 *
 * Lexical analysis API for MyLisp.
 */
#ifndef LEXER_H
#define LEXER_H

#include <string>
#include <vector>

/**
 * \brief Used to link token (Token) with it's type.
 * \date June 29, 2018
 *
 * TokenNumber - 12, 5.4, -200
 * TokenString - "hello, world"
 * TokenBoolean - true, false
 * TokenIdentifier - function name, variable name
 * TokenSeparator - ( ) ,
 */
enum TokenType {
  TokenNumber       /// \brief Tokens which store numeric data
  ,TokenString      /// \brief Tokens which store symbolic data, that ends and starts at "
  ,TokenBoolean     /// \brief Tokens which store only one state 1 or 0
  ,TokenIdentifier  /// \brief Tokens which link to something (variable, function)
  ,TokenSeparator   /// \brief Tokens which splits logical parts of code
};

/**
 * \brief Token's value type
 * \date June 29, 2018
 *
 * Simply, wrap of std::string
 */
typedef std::string TokenValue;

/**
 * \brief Minimal part of lexical analysis
 * \date June 29, 2018
 *
 * Structured pair of TokenType and TokenValue (aka std::string)
 */
class Token {
private:
  const TokenType type;    /// \brief Token's type
  const TokenValue value;  /// \brief Token's value

public:
  Token(TokenType type, std::string value);  /// \brief Constructor

  const TokenType getType();    /// /brief Getter for type property
  const TokenValue getValue();  /// \brief Getter for value property
};

/**
 * \brief Lexical analysis API instance
 * \date June 29, 2018
 */
class Lexer {
private:
  std::string code;  /// \brief Source code

public:
  Lexer(std::string code);  /// \brief Constructor

  /**
   * \brief Tokenize source code
   * \date June 29, 2018
   *
   * Generate sequence of tokens (std::vector) from code property (std::string)
   */
  std::vector tokenize();
};

#endif //LEXER_H
Everything looks fine (or so I thought), but when I run
doxygen src/lexer/lexer.hpp, Doxygen generates empty documentation. What's wrong?
Answer 0 (score: 0)
You should have a doxygen configuration file (Doxyfile); a default one can be generated with doxygen -g. In the Doxyfile you set the options you need. Then you start doxygen with just doxygen Doxyfile.
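For a small project like this one, only a handful of options usually need to change from the generated defaults. The fragment below is a sketch of such a Doxyfile; the input path is taken from the OP's command, while the project name and output directory are assumptions:

```
# Doxyfile fragment (generate the full file first with `doxygen -g`)
PROJECT_NAME     = "MyLisp"      # assumed project name
INPUT            = src/lexer     # path from the OP's command
FILE_PATTERNS    = *.hpp *.cpp
RECURSIVE        = YES
EXTRACT_ALL      = YES           # also document entities without doc comments
OUTPUT_DIRECTORY = docs          # assumed output location
```

With this in place, running doxygen Doxyfile from the project root writes HTML output under docs/html.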
Even with the setup the OP used, I did get documentation (along with many warning messages, such as: "warning: ignoring unknown tag `file' in file lexer.hpp, line 2", which indicates that something is wrong).
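Part of the noise comes from the header itself: trailing comments on members should use Doxygen's `///<` ("document the previous entity") syntax rather than `/// \brief`, there is a `/// /brief` typo, and `std::vector tokenize()` is missing its template argument. A corrected sketch of the Token and Lexer declarations, with illustrative inline bodies added so the snippet is self-contained (the OP's header only declares them):

```cpp
#include <string>
#include <utility>
#include <vector>

/// Token categories produced by the lexer.
enum TokenType {
  TokenNumber,      ///< Tokens which store numeric data
  TokenString,      ///< Tokens which store symbolic data delimited by "
  TokenBoolean,     ///< Tokens which store only one state, 1 or 0
  TokenIdentifier,  ///< Tokens which refer to something (variable, function)
  TokenSeparator    ///< Tokens which split logical parts of code
};

typedef std::string TokenValue;  ///< Simple wrapper over std::string

/// Structured pair of TokenType and TokenValue.
class Token {
private:
  const TokenType type;    ///< Token's type
  const TokenValue value;  ///< Token's value

public:
  /// Constructor.
  Token(TokenType type, std::string value)
      : type(type), value(std::move(value)) {}

  TokenType getType() const { return type; }      ///< Getter for type property
  TokenValue getValue() const { return value; }   ///< Getter for value property
};

/// Lexical analysis API instance.
class Lexer {
private:
  std::string code;  ///< Source code

public:
  /// Constructor.
  explicit Lexer(std::string code) : code(std::move(code)) {}

  /// Tokenize the source code (note the template argument the OP omitted).
  std::vector<Token> tokenize();
};
```

Top-level `const` on the return types was also dropped, since it has no effect on by-value returns; the getters are `const` member functions instead.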
See also the doxygen documentation.