solidity-antlr4
Solidity Language Lexer and Parser, generated by official ANTLR4 grammar.
Change Log · Report Bug · Pull Request
Installation
$ npm install solidity-antlr4
If you use pnpm or yarn, run pnpm add solidity-antlr4 or yarn add solidity-antlr4 instead.
Usage
Language Parser
parse(code, [options])
parse() parses the provided code as an entire Solidity source unit.

options:
- tolerant: boolean, default false. If true, the parser will try to parse as much as possible, even if the input is invalid, and never throw an error.
- selector: function, default (p) => p.sourceUnit(). If provided, the parser will only return the nodes that match the selector. This is useful when you want to parse a specific node (see the sketch after the example below).

output: SyntaxNode, the root node of the AST.
// parse.mjs
import { parse } from 'solidity-antlr4';
const code = `// SPDX-License-Identifier: MIT
pragma solidity ^0.8.20;
contract HelloWorld {
string public greet = "Hello World!";
}
`;
const ast = parse(code, { tolerant: true, selector: (p) => p.sourceUnit() });
// SourceUnit {
// type: 'SourceUnit',
// src: '32:88',
// range: [ 32, 120 ],
// location: Location {
// start: Position { line: 2, column: 0 },
// end: Position { line: 6, column: 0 }
// },
// context: SourceUnitContext {...},
// nodes: [
// PragmaDirective {
// type: 'PragmaDirective',
// literals: [Array]
// },
// ContractDefinition {
// type: 'ContractDefinition',
// name: [Identifier],
// contractKind: 'contract',
// abstract: false,
// baseContracts: [],
// nodes: [Array]
// }
// ]
// }
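The selector option can also target a single grammar rule instead of a whole source unit. Below is a minimal sketch, assuming the generated parser exposes an expression rule (as in the official Solidity grammar); the file name and sample expression are only illustrative.
// parse-expression.mjs
import { parse } from 'solidity-antlr4';
// Point the selector at the parser's expression rule to parse a lone expression.
const exprAst = parse('a + b * 2', { selector: (p) => p.expression() });
console.log(exprAst.type); // an expression node; the exact type name depends on the AST builder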
Tokenizer
tokenizer(code, [options])
tokenizer() splits the provided code into its tokens.

options:
- tolerant: boolean, default false.

output: SyntaxToken[].
// tokenizer.mjs
import { tokenizer } from 'solidity-antlr4';
const code = `...`; // same Solidity source as in the parse example
const tokens = tokenizer(code, { tolerant: true });
// [
// {
// type: 'SourceUnit',
// src: '32:88',
// range: [ 32, 120 ],
// location: Location {
// start: Position { line: 2, column: 0 },
// end: Position { line: 6, column: 0 }
// }
// },
// ...
// ]
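A quick sketch of consuming the result, using only the fields shown in the output above (type, range, and location):
// Log each token's type and where it starts in the source.
for (const token of tokens) {
  const { line, column } = token.location.start;
  console.log(`${token.type} at ${line}:${column}, range ${token.range[0]}-${token.range[1]}`);
}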
Traverse AST
The visit, serialize, and traverse helpers can be used alongside the parser to walk and transform the AST.
// visit.mjs
import { parse, visit, serialize } from 'solidity-antlr4';
const code = `...`; // Solidity source code
const ast = parse(code);
// Use `visit` to traverse the AST with enter/exit callbacks and per-node-type handlers.
visit(ast, {
  enter: ({ node, parent }) => {
    console.log(node.type, parent?.type); // print the node type and its parent's type
  },
  exit: () => {}, // called when exiting any node
  Identifier: ({ node: identifierNode }) => {
    console.log(identifierNode.name); // print the identifier name
  },
  exitContractDefinition: ({ node: contractDefinitionNode }) => {
    // called when exiting a ContractDefinition node
  }
});
// Use `serialize` to map AST nodes into a new structure.
const newAST = serialize(ast, ({ node }) => {
  // replace Identifier nodes with their name, keep every other node as-is
  if (node.type === 'Identifier') {
    return node.name;
  }
  return node;
});
// traverse.mjs
import { parse, traverse } from 'solidity-antlr4';
const code = `...`; // Solidity source code
const ast = parse(code);
const newAST = traverse(ast, (path) => {
  // path.path => `SourceUnit.ContractDefinition.FunctionDefinition` ...
  // path.node => current node
  // path.parentPath => parent node path
  // path.depth => current node depth
  // path.stop(); => stop the traversal
  // path.rewrite({...}); => rewrite the current node
  // path.matches({ type: 'xxx' }); => check if the current node matches the given filter
  // return () => {}; => the returned function is called when exiting the node
});
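As a small usage sketch built on the path helpers above (and on Identifier nodes exposing a name, as in the visit example), you could collect every identifier name in the source:
// Collect the names of all Identifier nodes via path.matches and path.node.
const identifierNames = [];
traverse(ast, (path) => {
  if (path.matches({ type: 'Identifier' })) {
    identifierNames.push(path.node.name);
  }
});
console.log(identifierNames); // e.g. contract and variable names from the source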
Low-level API
Not recommended, but the generated ANTLR lexer and parser are exported if you need lower-level access.
import { SolidityLexer, SolidityParser, CharStreams, CommonTokenStream } from 'solidity-antlr4';
const code = `...`; // code here
const input = CharStreams.fromString(code);
const lexer = new SolidityLexer(input);
const tokens = new CommonTokenStream(lexer);
const parser = new SolidityParser(tokens);
const parseTree = parser.sourceUnit();
// do something with parseTree
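One thing the low-level route allows is custom error handling. Below is a sketch, assuming the re-exported ANTLR runtime follows the standard error-listener API (removeErrorListeners/addErrorListener); the file name and sample input are only illustrative.
// errors.mjs
import { SolidityLexer, SolidityParser, CharStreams, CommonTokenStream } from 'solidity-antlr4';
const collected = [];
const input = CharStreams.fromString('contract { }'); // deliberately invalid: the contract has no name
const lexer = new SolidityLexer(input);
const parser = new SolidityParser(new CommonTokenStream(lexer));
parser.removeErrorListeners(); // drop the default console-printing listener
parser.addErrorListener({
  syntaxError: (_recognizer, _offendingSymbol, line, column, message) => {
    collected.push({ line, column, message });
  },
  // no-op stubs so the proxy listener can forward every callback safely
  reportAmbiguity: () => {},
  reportAttemptingFullContext: () => {},
  reportContextSensitivity: () => {}
});
parser.sourceUnit();
console.log(collected); // collected syntax errors for the invalid input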