Documentation ¶
Overview ¶
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
Filename          | SkyLine_Backend_Module_Token_Constants
Extension         | .go ( golang source code file )
Purpose           | Define constant definitions for string values of Tokens
Directory         | Modules/Backend/SkyTokens
Modular Directory | github.com/SkyPenguinLabs/SkyLine/Modules/Backend/SkyTokens
Package Name      | SkyLine_Backend_Tokens
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::: Module Description / Learners Activity :::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
This file defines each token's string value. For example, if the scanner comes across the byte '=' ( which is 61 in ASCII ), it knows to categorize that token
as TOKEN_ASSIGN. In a sense, this module defines the tokens' string values, what the scanner should look for, the patterns the scanner matches, and the type patterns
and rule sets for tokens and keywords.
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
Filename          | SkyLine_Backend_Module_Token_KeywordModels
Extension         | .go ( golang source code file )
Purpose           | Defines a map to map specific identifiers to keywords within the language
Directory         | Modules/Backend/SkyTokens
Modular Directory | github.com/SkyPenguinLabs/SkyLine/Modules/Backend/SkyTokens
Package Name      | SkyLine_Backend_Tokens
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::: Module Description / Learners Activity :::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
This file helps the scanner categorize a specific identifier as a keyword when it matches one. More precisely, it contains a map that maps
specific identifiers to their respective token values. This lets the scanner know what to emit when a token is not a single character.
Tokens in SkyLine are the following ¶
Token Named Value And Map Result | Token Description  | Tokens assigned
-------------------------------- | ------------------ | ----------------
TOKEN_FUNCTION_DEFINE_LITERAL    | defines function   | define, func
TOKEN_FUNCTION                   | allows functions   | Func, function
TOKEN_ALLOW                      | variable decl      | allow, set, cause, let
TOKEN_TRUE                       | boolean true       | true, BOOLEANT
TOKEN_FALSE                      | boolean false      | false, BOOLEANF
TOKEN_RETURN                     | return values      | ret, return
TOKEN_CONSTANT                   | constant variable  | constant, const
TOKEN_SWITCH                     | switch expression  | switch, sw
TOKEN_CASE                       | case expression    | case, cs
TOKEN_DEFAULT                    | default expression | default, df
TOKEN_REGISTER                   | register library   | register
TOKEN_KEYWORD_ENGINE             | SLC call           | ENGINE
TOKEN_IMPORT                     | import files       | import
TOKEN_FOR                        | for expression     | for
TOKEN_STRING                     | string             | STRING
TOKEN_INSIDE                     | within expression  | in
TOKEN_NULL                       | empty expression   | null
/////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////////
Filename          | SkyLine_Backend_Module_Token_KeywordModels
Extension         | .go ( golang source code file )
Purpose           | Defines all models / structures for this module
Directory         | Modules/Backend/SkyTokens
Modular Directory | github.com/SkyPenguinLabs/SkyLine/Modules/Backend/SkyTokens
Package Name      | SkyLine_Backend_Tokens
::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::: Module Description / Learners Activity :::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::::
This file defines two types: one is an alias of the string data type, known as SL_TokenDataType ( the data type of a token such as '=' ), and the other is a structure defining what a token
consists of within the language. In this case it consists of a literal ( the string value of the keyword or representation, e.g. '=' ) and the data type.
Index ¶
Constants ¶
const (
	// New Data types
	TOKEN_BYTE           = "BYTE"                                   // Byte value
	TOKEN_BYTESTART      = "CHARACTER"                              // ^ Call
	TOKEN_MODULECALL     = "::"                                     // imported object   | Implemented
	TOKEN_JUMP           = "JUMP"                                   // Jump To @lable    | Not implemented
	TOKEN_DEFAULT        = "DEFAULT"                                // Default keyword   | Implemented
	TOKEN_FOR            = "FOR"                                    // For loop token    | Implemented
	TOKEN_IMPORT         = "IMPORT"                                 // Import            | Implemented
	TOKEN_MODULE         = "module"                                 // Module            | Implemented
	TOKEN_KEYWORD_ENGINE = "ENGINE"                                 // ENGINE            | Implemented
	TOKEN_ENGINE_TYPE    = "ENGINE::ENVIRONMENT_MODIFIER->CALL:::>" // ENGINE ENV MODIFY | Implemented
	TOKEN_FOREACH        = "FOREACH"                                // For every element | Implemented
	TOKEN_INSIDE         = "IN"                                     // Within range      | Implemented
	TOKEN_REGISTER       = "REGISTER"                               // STD LIB Registry  | Implemented
	TOKEN_ILLEGAL        = "ILLEGAL"                                // Illegal character | Implemented
	TOKEN_EOF            = "EOF"                                    // End Of File       | Implemented
	TOKEN_IDENT          = "TOKEN_IDENT"                            // Identifier        | Implemented
	TOKEN_INT            = "Integer"                                // TYPE integer      | Implemented
	TOKEN_INTEGER8       = "Integer8"                               // TYPE integer      | Implemented
	TOKEN_INTEGER16      = "Integer16"                              // TYPE integer      | Implemented
	TOKEN_INTEGER32      = "Integer32"                              // TYPE integer      | Implemented
	TOKEN_INTEGER64      = "Integer64"                              // TYPE integer      | Implemented
	TOKEN_FLOAT          = "FLOAT"                                  // TYPE float        | Implemented
	TOKEN_STRING         = "STRING"                                 // TYPE string       | Implemented
	TOKEN_NULL           = "NULL"                                   // Type NULL         | Implemented
	TOKEN_CONSTANT       = "CONST"                                  // Constant          | Implemented
	TOKEN_FUNCTION       = "FUNCTION"                               // Function          | Implemented
	TOKEN_ALLOW          = "SET"                                    // let statement     | Implemented
	TOKEN_TRUE           = "TOKEN_TRUE"                             // boolean type true | Implemented
	TOKEN_FALSE          = "FALSE"                                  // boolean type false | Implemented
	TOKEN_IF             = "IF"                                     // If statement      | Implemented
	TOKEN_ELSE           = "ELSE"                                   // Else statement    | Implemented
	TOKEN_RETURN         = "RETURN"                                 // return statement  | Implemented
	TOKEN_SWITCH         = "SWITCH"                                 // Switch statement  | Implemented
	TOKEN_CASE           = "CASE"                                   // Case statement    | Implemented
	TOKEN_REGEXP         = "REGEXP"                                 // Regex Type        | Not implemented
	TOKEN_Lable          = "@"                                      // Lables            | Not implemented

	TOKEN_LTEQ                    = "<="              // LT or equal to     | Implemented
	TOKEN_GTEQ                    = ">="              // GT or equal to     | Implemented
	TOKEN_ASTERISK_EQUALS         = "*="              // Multiply equals    | Implemented
	TOKEN_BANG                    = "!"               // Boolean operator   | Implemented
	TOKEN_ASSIGN                  = "="               // General assignment | Implemented
	TOKEN_PLUS                    = "+"               // General operator   | Implemented
	TOKEN_MINUS                   = "-"               // General operator   | Implemented
	TOKEN_ASTARISK                = "*"               // General operator   | Implemented
	TOKEN_SLASH                   = "/"               // General operator   | Implemented
	TOKEN_LT                      = "<"               // Boolean operator   | Implemented
	TOKEN_GT                      = ">"               // Boolean operator   | Implemented
	TOKEN_EQ                      = "=="              // Boolean operator   | Implemented
	TOKEN_MINUS_EQUALS            = "-="              // Minus equals       | Implemented
	TOKEN_NEQ                     = "!="              // Boolean operator   | Implemented
	TOKEN_DIVEQ                   = "/="              // Division operator  | Implemented
	TOKEN_PERIOD                  = "."               // Method Call        | Implemented
	TOKEN_PLUS_EQUALS             = "+="              // Plus equals        | Implemented
	TOKEN_COMMA                   = ","               // Separation         | Implemented
	TOKEN_SEMICOLON               = ";"               // SemiColon          | Implemented
	TOKEN_COLON                   = ":"               // Colon              | Implemented
	TOKEN_LPAREN                  = "("               // Args start         | Implemented
	TOKEN_RPAREN                  = ")"               // Args end           | Implemented
	TOKEN_LINE                    = "|"               // Line con           | Implemented
	TOKEN_LBRACE                  = "{"               // Open f             | Implemented
	TOKEN_RBRACE                  = "}"               // Close f            | Implemented
	TOKEN_LBRACKET                = "["               // Open               | Implemented
	TOKEN_RBRACKET                = "]"               // Close              | Implemented
	TOKEN_OROR                    = "||"              // Condition or or    | Implemented
	TOKEN_ANDAND                  = "&&"              // Boolean operator   | Implemented
	TOKEN_BACKTICK                = "`"               // Backtick           | Implemented
	TOKEN_POWEROF                 = "**"              // General operator   | Implemented
	TOKEN_MODULO                  = "%"               // General operator   | Implemented
	TOKEN_NEWLINE                 = '\n'              // COND               | Implemented
	TOKEN_PLUS_PLUS               = "++"              // Plus Plus          | Implemented
	TOKEN_QUESTION                = "?"               // Question que       | Not implemented
	TOKEN_DOTDOT                  = ".."              // Range              | Implemented
	TOKEN_CONTAINS                = "~="              // Contains           | Not implemented
	TOKEN_NOTCONTAIN              = "!~"              // Boolean operator   | Not implemented
	TOKEN_MINUS_MINUS             = "--"              // Minus minus        | Implemented
	TOKEN_BITWISE_OP_OR           = "|"               // Bitwise OR         | Implemented
	TOKEN_BITWISE_OP_XOR          = "^"               // Bitwise XOR        | Implemented
	TOKEN_BITWISE_OP_AND          = "&"               // Bitwise AND        | Implemented
	TOKEN_BITWISE_OP_LSHIFT       = "<<"              // Bitwise Shift L    | Implemented
	TOKEN_BITWISE_OP_RSHIFT       = ">>"              // Bitwise Shift R    | Implemented
	TOKEN_PEEKASSIGN              = ":="              // Peek assignment    | Implemented
	TOKEN_FUNCTION_DEFINE_LITERAL = "DEFINE_FUNCTION" // Define function    | Implemented
)
Variables ¶
var SkyLine_Keywords = map[string]SL_TokenDataType{
	"define":   TOKEN_FUNCTION_DEFINE_LITERAL,
	"func":     TOKEN_FUNCTION_DEFINE_LITERAL,
	"Func":     TOKEN_FUNCTION,
	"function": TOKEN_FUNCTION,
	"let":      TOKEN_FUNCTION,
	"set":      TOKEN_ALLOW,
	"cause":    TOKEN_ALLOW,
	"allow":    TOKEN_ALLOW,
	"true":     TOKEN_TRUE,
	"false":    TOKEN_FALSE,
	"if":       TOKEN_IF,
	"else":     TOKEN_ELSE,
	"return":   TOKEN_RETURN,
	"ret":      TOKEN_RETURN,
	"const":    TOKEN_CONSTANT,
	"constant": TOKEN_CONSTANT,
	"switch":   TOKEN_SWITCH,
	"sw":       TOKEN_SWITCH,
	"case":     TOKEN_CASE,
	"cs":       TOKEN_CASE,
	"default":  TOKEN_DEFAULT,
	"df":       TOKEN_DEFAULT,
	"register": TOKEN_REGISTER,
	"ENGINE":   TOKEN_KEYWORD_ENGINE,
	"import":   TOKEN_IMPORT,
	"for":      TOKEN_FOR,
	"STRING":   TOKEN_STRING,
	"BOOLEANT": TOKEN_TRUE,
	"BOOLEANF": TOKEN_FALSE,
	"foreach":  TOKEN_FOREACH,
	"in":       TOKEN_INSIDE,
	"null":     TOKEN_NULL,
	"jmp":      TOKEN_JUMP,
	"jump":     TOKEN_JUMP,
}
Functions ¶
This section is empty.
Types ¶
type SL_TokenConstruct ¶
type SL_TokenConstruct struct {
	Token_Type SL_TokenDataType
	Literal    string
}
type SL_TokenDataType ¶
type SL_TokenDataType string